CN106778885A - Hyperspectral image classification method based on local manifold embedding - Google Patents

Hyperspectral image classification method based on local manifold embedding

Info

Publication number
CN106778885A
CN106778885A (application CN201611219213.9A)
Authority
CN
China
Prior art keywords
class
data
dimensional
image
point
Prior art date
Legal status
Pending
Application number
CN201611219213.9A
Other languages
Chinese (zh)
Inventor
黄鸿
罗甫林
段宇乐
石光耀
Current Assignee
Chongqing University
Original Assignee
Chongqing University
Priority date
Filing date
Publication date
Application filed by Chongqing University
Priority to CN201611219213.9A
Publication of CN106778885A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24143Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a hyperspectral image classification method based on local manifold embedding. 1) Each data point of the training samples is reconstructed from its same-class nearest neighbors. 2) The neighborhood of each data point and the reconstruction points corresponding to the neighbors are used to build a within-class graph, a within-class reconstruction graph, a between-class graph, and a between-class reconstruction graph. 3) In the low-dimensional embedding space, the structures of the within-class graph and the within-class reconstruction graph are kept unchanged while the structural relations of the between-class graph and the between-class reconstruction graph are suppressed, yielding a projection matrix from the high-dimensional space to the low-dimensional space. 4) The projection matrix gives the low-dimensional embedding features of the training samples. 5) The projection matrix reduces the dimensionality of the high-dimensional test data, giving the low-dimensional embeddings of the test samples. 6) A classifier applied to the low-dimensional embeddings of the test samples produces the hyperspectral image classification result. The present invention better characterizes the intrinsic properties of hyperspectral images, extracts discriminative features more effectively, and improves data separability.

Description

Hyperspectral image classification method based on local manifold embedding
Technical field
The present invention relates to hyperspectral image classification, and in particular to a hyperspectral image classification method based on local manifold embedding, belonging to the technical field of hyperspectral image classification.
Background technology
Researchers proposed hyperspectral remote sensing in the early 1980s on the basis of multispectral remote sensing. The spectral resolution of hyperspectral remote sensing images reaches the order of $10^{-2}\lambda$ (nanometer scale), the wavelength range extends from the visible to the short-wave infrared, and the number of spectral bands reaches dozens or even hundreds. The high spectral resolution makes the interval between adjacent bands of hyperspectral image data narrow, with overlapping band regions, so that the spectral channels are no longer discrete but continuous; hyperspectral remote sensing is therefore often also called imaging spectral remote sensing. Hyperspectral remote sensing can not only solve the identification of major ground-object classes, but also perform within-class subdivision or fine spectral feature extraction. Hyperspectral remote sensing image classification first requires feature extraction on the data under test to realize dimensionality reduction, and the extracted features are then classified.
Hyperspectral remote sensing images are obtained by imaging spectrometers and contain abundant information, bringing new opportunities to ground-object research. However, because the data volume is large, the correlation between bands is strong, the redundancy is large, and the dimensionality is high, conventional classification methods easily suffer from the Hughes phenomenon, i.e. the "curse of dimensionality". Therefore, how to efficiently extract the hidden features from high-dimensional data and reduce the data dimensionality has become a focus of hyperspectral remote sensing research in data processing.
1. Manifold learning
Feature extraction methods based on statistical principles mainly use the statistical characteristics of the data and ignore its geometric distribution. To reveal the intrinsic structure of data, researchers proposed the concept of the "manifold", a generalization of Euclidean space: every point of a manifold has a neighborhood homeomorphic to a region of Euclidean space, i.e. a manifold can be glued together from a large number of Euclidean patches. Manifolds involve topology, mathematical analysis, differential geometry, algebra and other disciplines, and have become a major tool of modern scientific research.
A manifold may be defined mathematically as follows: a Hausdorff space $\mathcal{M}$ is called a $d$-dimensional topological manifold ($d$-dimensional manifold) if every point $x\in\mathcal{M}$ has an open neighborhood $U$ homeomorphic to an open subset of the Euclidean space $\mathbb{R}^d$. A Hausdorff space is a space in which any two distinct points have their own open neighborhoods with no intersection between them. Whitney showed that any manifold can be embedded into a Euclidean space of sufficiently large dimension.
Manifold learning is a data processing method proposed on the basis of manifolds, whose goal is to find an embeddable low-dimensional manifold in high-dimensional data. The concept of manifold learning was first proposed by Bregler and Omohundro in 1994 in research on speech recognition and image interpolation; in 2000, two papers on manifold learning algorithms published in Science pushed the research and application of manifold learning to a peak. The premise of manifold learning is that a potential manifold underlies the high-dimensional data; learning on the high-dimensional data in a certain way yields a mapping that projects the data from the high-dimensional space to a low-dimensional space without changing the intrinsic features or geometric structure of the original high-dimensional space, thereby revealing the intrinsic properties of the data.
The mathematical description of manifold learning is: given a group of $D$-dimensional data $X=[x_1,x_2,\ldots,x_N]$ assumed to lie on a low-dimensional manifold of intrinsic dimension $d$ (in general $d\ll D$), the purpose of manifold learning is to find for each high-dimensional data point $x_i\in\mathbb{R}^D$ a low-dimensional embedding $y_i\in\mathbb{R}^d$, i.e. to solve for a mapping $g$ from the high-dimensional space to the low-dimensional space such that $y_i=g(x_i)$, together with a reconstruction mapping $g^{-1}$ such that $x_i=g^{-1}(y_i)$; under these constraints, $g$ should not change the intrinsic characteristics or geometric relations of the original high-dimensional data.
With the wide application of manifold learning, scholars have proposed a large number of manifold learning algorithms, among which the classic ones are mainly ISOMAP, LLE and LE.
The ISOMAP algorithm was proposed by Tenenbaum et al. in 2000. Its basic idea is to measure the geometric relations of high-dimensional data with geodesic distances and, in the low-dimensional embedding space, keep the geometric structure of neighboring data unchanged: the mapping does not change the geodesic distances of neighboring data in the high-dimensional space, thereby revealing the low-dimensional manifold in the high-dimensional data.
LLE was proposed by Roweis and Saul in 2000. Its basic principle is that nonlinear data are locally linearly distributed: each point is represented by a linear combination of its neighbors, and the low-dimensional embedding does not change the local linear structure expressed in the high-dimensional space, thereby revealing the intrinsic manifold of the high-dimensional data.
The LE algorithm was proposed by Belkin and Niyogi in 2003. It builds a similarity graph from the local similarities of the data in the high-dimensional space, processes the graph with the Laplace operator, and keeps the local information of the data unchanged in the low-dimensional embedding space, obtaining the low-dimensional embedding features. The principle of LE is that data far apart (or close together) in the high-dimensional space remain far apart (or close together) in the low-dimensional space.
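For concreteness, the following is a minimal numpy/scipy sketch of the LE idea just described, assuming a k-nearest-neighbor graph with heat-kernel weights and the usual generalized eigenproblem $Lv=\lambda Dv$; the function name and parameter defaults are illustrative, not taken from the text.

```python
# Minimal Laplacian Eigenmaps (LE) sketch; k and t are illustrative choices.
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, k=9, t=1.0, d=2):
    """X: (N, D) data matrix; returns an (N, d) low-dimensional embedding."""
    N = X.shape[0]
    dist2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)  # pairwise squared distances
    W = np.zeros((N, N))
    for i in range(N):
        nbrs = np.argsort(dist2[i])[1:k + 1]      # k nearest neighbors (index 0 is the point itself)
        W[i, nbrs] = np.exp(-dist2[i, nbrs] / t)  # heat-kernel similarity
    W = np.maximum(W, W.T)                        # symmetrize the graph
    D = np.diag(W.sum(1))
    L = D - W
    vals, vecs = eigh(L, D)                       # generalized eigenproblem L v = lambda D v
    return vecs[:, 1:d + 1]                       # skip the trivial constant eigenvector
```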
2. Graph embedding methods
Graph embedding is a unified framework for expressing feature extraction algorithms: it not only unifies most classic feature extraction algorithms but also allows new ones to be developed, these algorithms differing mainly in how the similarity matrix and the constraint matrix are constructed.
2.1 Graph embedding
Graph embedding (GE) uses spectral graph theory to express certain statistical or geometric characteristics of the data: the constructed graphs are operated on by the Laplace operator, and in the low-dimensional embedding the advantageous information in the graphs is retained while the useless information is suppressed, realizing feature extraction. In practice, an intrinsic graph is built to represent the statistical or geometric properties of same-class data, and a penalty graph describes the statistical or geometric properties between different-class data. The intrinsic graph $G=\{X,W\}$ and the penalty graph $G^P=\{X,W^P\}$ are both undirected graphs, where $X$ denotes the vertices and $W$ and $W^P$ are the weight matrices of $G$ and $G^P$ respectively. The entry $w_{ij}$ in row $i$, column $j$ of $W$ is the edge weight between vertices $x_i$ and $x_j$ in $G$; it reflects the similarity between the same-class data $x_i$ and $x_j$, and these similarity relations are to be retained in the low-dimensional embedding. The entry $w^P_{ij}$ of $W^P$ is the edge weight between vertices $x_i$ and $x_j$ in $G^P$; it indicates the closeness between the different-class data $x_i$ and $x_j$, and these relations are to be suppressed in the low-dimensional embedding.
According to the graph embedding principle, the objective function may be defined as

$$Y^*=\arg\min_{\operatorname{tr}(YHY^{\mathrm T})=h}\ \sum_{i\neq j}\|y_i-y_j\|^2\,w_{ij}\qquad(1)$$

In the formula, $h$ is a constant and $H$ is the constraint matrix introduced to eliminate degenerate solutions; $H$ is generally set to the identity matrix, with the data normalized, and may also be set to the Laplacian matrix of the penalty graph, $L^P=D^P-W^P$, where $D^P=\operatorname{diag}([d^P_{11},d^P_{22},\ldots,d^P_{NN}])$ is a diagonal matrix with $d^P_{ii}=\sum_j w^P_{ij}$. $L=D-W$ is the Laplacian matrix of the intrinsic graph, with $D=\operatorname{diag}([d_{11},d_{22},\ldots,d_{NN}])$ a diagonal matrix and $d_{ii}=\sum_j w_{ij}$.

The objective function can be transformed to

$$Y^*=\arg\min_{\operatorname{tr}(YHY^{\mathrm T})=h}\ \operatorname{tr}(YLY^{\mathrm T})\qquad(2)$$

In the linear case $Y=V^{\mathrm T}X$, the graph embedding objective function can be expressed as

$$V^*=\arg\min_{\operatorname{tr}(V^{\mathrm T}XHX^{\mathrm T}V)=h}\ \operatorname{tr}(V^{\mathrm T}XLX^{\mathrm T}V)\qquad(3)$$
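A minimal sketch of how formula (3) is typically solved, assuming the Lagrangian reduction to the generalized eigenproblem $XLX^{\mathrm T}v=\lambda XHX^{\mathrm T}v$; the small ridge term is an implementation assumption for numerical stability.

```python
# Linear graph embedding: bottom-d generalized eigenvectors of X L X^T v = lambda X H X^T v.
import numpy as np
from scipy.linalg import eigh

def linear_graph_embedding(X, L, H, d):
    """X: (D, N) data with samples as columns; L, H: (N, N) Laplacian and constraint matrices."""
    A = X @ L @ X.T
    B = X @ H @ X.T + 1e-6 * np.eye(X.shape[0])   # ridge keeps B positive definite
    vals, vecs = eigh(A, B)                        # ascending generalized eigenvalues
    return vecs[:, :d]                             # projection matrix V of shape (D, d)
```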
2.2 Marginal Fisher analysis
Under the graph embedding framework, Yan et al. proposed the MFA algorithm, which builds a within-class graph and a between-class graph so that same-class data aggregate as much as possible and different-class data stay as far apart as possible. The within-class graph reveals the similarity relations of same-class data and promotes their aggregation; the between-class graph is a penalty graph that suppresses the similarity between different-class data and strengthens their separability.
Fig. 1 illustrates the principle of the MFA algorithm, i.e. the construction of the within-class and between-class graphs. In the within-class graph, each data point (e.g. points x1 and x2) is connected with its same-class nearest neighbors, to increase the aggregation of same-class data in the low-dimensional embedding. In the between-class graph, edges are built between each data point (e.g. point x3) and its different-class nearest neighbors, to strengthen the separability between different-class data in the low-dimensional embedding.
In the within-class graph, edges exist only between same-class neighboring data, and the similarity between the data is expressed by the weight of each edge; the edge weight $w'_{ij}$ between $x_i$ and $x_j$ may be defined as

$$w'_{ij}=\begin{cases}1, & l_i=l_j \text{ and } \big(x_j\in N_{k_1}(x_i) \text{ or } x_i\in N_{k_1}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(4)$$

In the formula, $l_i$ and $l_j$ are the class labels of $x_i$ and $x_j$, and $N_{k_1}(x_i)$ denotes the $k_1$ same-class nearest neighbors of $x_i$.

In the between-class graph, connecting edges exist only between different-class neighboring data, reflecting the degree of closeness between different-class data; the weight between data points $x_i$ and $x_j$ is

$$w^{P\prime}_{ij}=\begin{cases}1, & l_i\neq l_j \text{ and } \big(x_j\in N^P_{k_2}(x_i) \text{ or } x_i\in N^P_{k_2}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(5)$$

where $N^P_{k_2}(x_i)$ denotes the $k_2$ different-class nearest neighbors of $x_i$.
In the low-dimensional embedding space, the similarity between same-class neighboring data is kept unchanged and same-class data are aggregated as much as possible, giving the objective function

$$\min_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w'_{ij}=\min_V\ \operatorname{tr}(V^{\mathrm T}XL'X^{\mathrm T}V)\qquad(6)$$

In the formula, $L'=D'-W'$, where $D'=\operatorname{diag}([d'_{11},d'_{22},\ldots,d'_{NN}])$ and $d'_{ii}=\sum_j w'_{ij}$.

In addition, the similarity relations between different-class neighboring data should be suppressed in the low-dimensional embedding space, keeping different-class data as far apart as possible:

$$\max_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w^{P\prime}_{ij}=\max_V\ \operatorname{tr}(V^{\mathrm T}XL^{P\prime}X^{\mathrm T}V)\qquad(7)$$

In the formula, $L^{P\prime}=D^{P\prime}-W^{P\prime}$, where $D^{P\prime}=\operatorname{diag}([d^{P\prime}_{11},d^{P\prime}_{22},\ldots,d^{P\prime}_{NN}])$ and $d^{P\prime}_{ii}=\sum_j w^{P\prime}_{ij}$.
According to formulas (6) and (7), the optimization objective can be converted to

$$\min_V\ \frac{V^{\mathrm T}XL'X^{\mathrm T}V}{V^{\mathrm T}XL^{P\prime}X^{\mathrm T}V}\qquad(8)$$

By the method of Lagrange multipliers, the optimal solution of formula (8) is given by

$$XL'X^{\mathrm T}V=\lambda XL^{P\prime}X^{\mathrm T}V\qquad(9)$$

The generalized eigenvalues of formula (9) are arranged in ascending order, and the eigenvectors corresponding to the first $d$ eigenvalues form the mapping matrix $V=[v_1,v_2,\ldots,v_d]$.
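As an illustration, a hedged MFA sketch following the description above: binary within-class ($k_1$) and between-class ($k_2$) neighbor graphs, then the generalized eigenproblem of formula (9). The brute-force neighbor search, the ridge term and the parameter defaults are illustrative simplifications.

```python
# Marginal Fisher Analysis (MFA) sketch; labels must be a numpy array.
import numpy as np
from scipy.linalg import eigh

def mfa(X, labels, k1=9, k2=180, d=30):
    """X: (D, N) with samples as columns; returns the (D, d) mapping matrix V."""
    N = X.shape[1]
    dist2 = np.square(X.T[:, None, :] - X.T[None, :, :]).sum(-1)
    Ww = np.zeros((N, N)); Wb = np.zeros((N, N))
    for i in range(N):
        order = np.argsort(dist2[i])
        same = [j for j in order if labels[j] == labels[i] and j != i][:k1]
        diff = [j for j in order if labels[j] != labels[i]][:k2]
        Ww[i, same] = 1.0                          # within-class graph, formula (4)
        Wb[i, diff] = 1.0                          # between-class penalty graph, formula (5)
    Ww = np.maximum(Ww, Ww.T); Wb = np.maximum(Wb, Wb.T)
    Lw = np.diag(Ww.sum(1)) - Ww
    Lb = np.diag(Wb.sum(1)) - Wb
    vals, vecs = eigh(X @ Lw @ X.T, X @ Lb @ X.T + 1e-6 * np.eye(X.shape[0]))
    return vecs[:, :d]                             # ascending eigenvalues: first d eigenvectors
```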
Although MFA strengthens the aggregation of same-class data and the separation of different-class data by building within-class and between-class graphs, only the neighborhood structure of the data is considered when the two graphs are built. For hyperspectral images containing large homogeneous regions, MFA therefore cannot effectively characterize the intrinsic manifold of the data and does not reach ideal classification results.
The content of the invention
In view of the deficiency that MFA cannot effectively characterize the intrinsic manifold of the data, the object of the present invention is to provide a hyperspectral image classification method based on local manifold embedding that better characterizes the intrinsic properties of hyperspectral images, extracts discriminative features more effectively, and improves data separability.
To achieve this goal, the technical solution adopted by the present invention is as follows:
A hyperspectral image classification method based on local manifold embedding, characterized in that it comprises the following steps:
1) Select training samples $X=[x_1,x_2,\ldots,x_N]$ with known class labels, where $l_i$ is the class label of $x_i$, and reconstruct each data point of the training samples from its same-class nearest neighbors;
2) Use the neighborhood of each data point and the reconstruction points corresponding to the neighbors to build a within-class graph, a within-class reconstruction graph, a between-class graph, and a between-class reconstruction graph;
3) In the low-dimensional embedding space, keep the structures of the within-class graph and the within-class reconstruction graph unchanged while suppressing the structural relations of the between-class graph and the between-class reconstruction graph, obtaining the projection matrix from the high-dimensional space to the low-dimensional space;
4) With the projection matrix obtained in step 3), reduce the dimensionality of the high-dimensional training data to obtain the low-dimensional embedding features of the training samples;
5) With the projection matrix obtained in step 3), take the hyperspectral image to be classified as test samples and reduce the dimensionality of their high-dimensional data to obtain the low-dimensional embeddings of the test samples;
6) With the low-dimensional embedding features of the training samples obtained in step 4) and a chosen classifier, classify the low-dimensional embeddings of the test samples to obtain the hyperspectral image classification result.
In step 1), each data point of the training samples is reconstructed from its same-class nearest neighbors as follows.
For each training data point $x_i$, $k_1$ nearest neighbors are chosen from the same-class data to reconstruct $x_i$; the reconstruction point is

$$\hat{x}_i=\sum_j s_{ij}x_j=Xs_i\qquad(10)$$

In the formula, $s_{ij}$ is the reconstruction weight between data points $x_i$ and $x_j$, with $\sum_j s_{ij}=1$ and $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$; if $x_j$ is a same-class nearest neighbor of $x_i$ then $s_{ij}\neq 0$, otherwise $s_{ij}=0$. It is defined as the normalized value

$$s_{ij}=\begin{cases}\dfrac{\exp(-\|x_i-x_j\|^2/t)}{\sum_{x_l\in N_{k_1}(x_i)}\exp(-\|x_i-x_l\|^2/t)}, & x_j\in N_{k_1}(x_i)\\ 0, & \text{otherwise}\end{cases}\qquad(11)$$

where $N_{k_1}(x_i)$ denotes the $k_1$ same-class nearest neighbors of $x_i$ and $t$ is the heat-kernel parameter.
In step 2), the within-class graph, within-class reconstruction graph, between-class graph and between-class reconstruction graph are constructed as follows.
Build the within-class graph $G_w=\{X,W_w\}$, with $X$ as the vertices of the graph: if two vertices $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between them; otherwise there is no edge. The edge weight $w^{w}_{ij}$ expresses the similarity relation between $x_i$ and $x_j$ and is defined as

$$w^{w}_{ij}=\begin{cases}\exp(-\|x_i-x_j\|^2/t), & l_i=l_j \text{ and } \big(x_j\in N_{k_1}(x_i) \text{ or } x_i\in N_{k_1}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(12)$$

where $t$ is the heat-kernel parameter.
Build the within-class reconstruction graph $G_{rw}=\{\hat X,W_{rw}\}$, with the reconstruction points $\hat X$ as the vertices: if $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise there is no edge. The edge weight $w^{rw}_{ij}$ expresses the similarity relation between $\hat x_i$ and $\hat x_j$ and is defined as

$$w^{rw}_{ij}=\begin{cases}\exp(-\|\hat x_i-\hat x_j\|^2/t), & l_i=l_j \text{ and } \big(x_j\in N_{k_1}(x_i) \text{ or } x_i\in N_{k_1}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(13)$$

Build the between-class graph $G_b=\{X,W_b\}$: if two vertices $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between them; otherwise they are not connected. The edge weight $w^{b}_{ij}$ expresses the degree of closeness between $x_i$ and $x_j$ and is defined as

$$w^{b}_{ij}=\begin{cases}\exp(-\|x_i-x_j\|^2/t), & l_i\neq l_j \text{ and } \big(x_j\in N^{b}_{k_2}(x_i) \text{ or } x_i\in N^{b}_{k_2}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(14)$$

where $N^{b}_{k_2}(x_i)$ denotes the $k_2$ different-class nearest neighbors of $x_i$.
Build the between-class reconstruction graph $G_{rb}=\{\hat X,W_{rb}\}$: if $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise they are not connected. The edge weight $w^{rb}_{ij}$ expresses the degree of closeness between $\hat x_i$ and $\hat x_j$ and is defined as

$$w^{rb}_{ij}=\begin{cases}\exp(-\|\hat x_i-\hat x_j\|^2/t), & l_i\neq l_j \text{ and } \big(x_j\in N^{b}_{k_2}(x_i) \text{ or } x_i\in N^{b}_{k_2}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(15)$$
In step 3), the projection matrix is determined as follows.
For within-class data, the similarity relations of the within-class graph and the within-class reconstruction graph are not changed in the low-dimensional space, so that same-class data and their reconstruction data cluster together as much as possible, reducing the differences between same-class data; the objective function is expressed as

$$\min_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w^{w}_{ij}+\sum_{i\neq j}\|V^{\mathrm T}\hat x_i-V^{\mathrm T}\hat x_j\|^2\,w^{rw}_{ij}=\min_V\ \operatorname{tr}(V^{\mathrm T}XM_wX^{\mathrm T}V)\qquad(16)$$

In the formula, $M_w=L_w+SL_{rw}S^{\mathrm T}$, with $L_w=D_w-W_w$ and $L_{rw}=D_{rw}-W_{rw}$; $D_w$ and $D_{rw}$ are diagonal matrices with $d^{w}_{ii}=\sum_j w^{w}_{ij}$ and $d^{rw}_{ii}=\sum_j w^{rw}_{ij}$; $S=[s_1,s_2,\ldots,s_N]$, $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$.
For between-class data, the similarities between the data of the between-class graph and the between-class reconstruction graph are suppressed in the low-dimensional space, separating different-class data and increasing the differences between them; the objective function is expressed as

$$\max_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w^{b}_{ij}+\sum_{i\neq j}\|V^{\mathrm T}\hat x_i-V^{\mathrm T}\hat x_j\|^2\,w^{rb}_{ij}=\max_V\ \operatorname{tr}(V^{\mathrm T}XM_bX^{\mathrm T}V)\qquad(17)$$

In the formula, $M_b=L_b+SL_{rb}S^{\mathrm T}$, with $L_b=D_b-W_b$ and $L_{rb}=D_{rb}-W_{rb}$; $D_b$ and $D_{rb}$ are diagonal matrices with $d^{b}_{ii}=\sum_j w^{b}_{ij}$ and $d^{rb}_{ii}=\sum_j w^{rb}_{ij}$.
The optimization problems of formulas (16) and (17) are converted into

$$\max_V\ \frac{V^{\mathrm T}XM_bX^{\mathrm T}V}{V^{\mathrm T}XM_wX^{\mathrm T}V}\qquad(18)$$

By the method of Lagrange multipliers, one obtains

$$XM_bX^{\mathrm T}V=\lambda XM_wX^{\mathrm T}V\qquad(19)$$

The eigenvalues of formula (19) are computed and arranged in descending order, and the eigenvectors corresponding to the first $d$ eigenvalues form the projection matrix $V=[v_1,v_2,\ldots,v_d]$.
The present invention characterizes the intrinsic structure of the hyperspectral data using the neighborhood of each data point and the within-class reconstruction points of the neighbors: when building the within-class and between-class graphs, not only the neighborhood relations of the data but also the neighborhood structures of the neighbors are considered, so that more implicit information is obtained from the hyperspectral data. This strengthens the aggregation of within-class data and the separability of between-class data, highlights the differences between different-class data, and better characterizes the intrinsic manifold structure of the hyperspectral data, thereby extracting discriminative features and improving classification accuracy.
Therefore, the hyperspectral image classification method proposed by the present invention extracts discriminative features more effectively, gives more accurate classification results, and classifies the ground objects of hyperspectral images better. Comparative experiments also show that this method has clear advantages over the various existing methods.
Brief description of the drawings
Fig. 1 - Schematic diagram of the MFA algorithm.
Fig. 2 - Schematic diagram of the classification process of the present invention.
Fig. 3 - Schematic diagram of the graph construction of the present invention.
Fig. 4 - Overall classification accuracy for different parameters k and β on the Salinas data set.
Fig. 5 - Classification maps of each algorithm with SVMCK on the Salinas data set: (a) GT, (b) Baseline (95.8%), (c) PCA (94.8%), (d) NPE (96.4%), (e) LPP (96.0%), (f) LDA (94.6%), (g) LFDA (96.3%), (h) MMC (94.7%), (i) MFA (95.5%), (j) LME (99.2%).
Fig. 6 - Classification accuracy for different parameters k and β on the Indian Pines data set.
Fig. 7 - Classification maps of each algorithm with SVMCK on the Indian Pines data set: (a) GT, (b) Baseline (93.9%), (c) PCA (91.8%), (d) NPE (92.0%), (e) LPP (91.0%), (f) LDA (95.0%), (g) LFDA (90.9%), (h) MMC (89.9%), (i) MFA (93.5%), (j) LME (98.1%).
Fig. 8 - Two-dimensional embedding results of five ground-object classes in the Indian Pines data set: (a) spectral curves, (b) PCA, (c) NPE, (d) LPP, (e) LDA, (f) LFDA, (g) MMC, (h) MFA, (i) LME.
Specific embodiment
From the procedure of the MFA algorithm it can be seen that only the neighborhood structure of the data is considered in graph construction; for hyperspectral images containing large homogeneous regions, MFA cannot effectively characterize the intrinsic manifold of the data. To improve the effect of MFA in hyperspectral image feature extraction, the present invention proposes a new manifold learning algorithm called local manifold embedding (LME).
The present invention characterizes the intrinsic structure of the hyperspectral image using the neighborhood of each data point and the neighborhoods of its neighbors. First, each data point is reconstructed from its same-class nearest neighbors; then, the neighborhood of each data point and the reconstruction points corresponding to the neighbors are used to build the within-class graph, within-class reconstruction graph, between-class graph and between-class reconstruction graph; finally, in the low-dimensional embedding space, the structures of the within-class graphs are kept unchanged while the structural relations of the between-class graphs are suppressed, yielding the projection matrix from the high-dimensional space to the low-dimensional space and thereby extracting discriminative features. Fig. 2 is a schematic diagram of the classification process of the invention.
For each training data point $x_i$, $k_1$ nearest neighbors are chosen from the same-class data to reconstruct $x_i$; the reconstruction point is

$$\hat{x}_i=\sum_j s_{ij}x_j=Xs_i\qquad(10)$$

In the formula, $s_{ij}$ is the reconstruction weight between data points $x_i$ and $x_j$, with $\sum_j s_{ij}=1$ and $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$; if $x_j$ is a same-class nearest neighbor of $x_i$ then $s_{ij}\neq 0$, otherwise $s_{ij}=0$. It is defined as the normalized value

$$s_{ij}=\begin{cases}\dfrac{\exp(-\|x_i-x_j\|^2/t)}{\sum_{x_l\in N_{k_1}(x_i)}\exp(-\|x_i-x_l\|^2/t)}, & x_j\in N_{k_1}(x_i)\\ 0, & \text{otherwise}\end{cases}\qquad(11)$$

where $N_{k_1}(x_i)$ denotes the $k_1$ same-class nearest neighbors of $x_i$ and $t$ is the heat-kernel parameter.
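A minimal sketch of this reconstruction step, under the normalized heat-kernel weight form of formula (11) assumed above; `reconstruction_points` and `t` are illustrative names, and `labels` is assumed to be a numpy array.

```python
# Same-class neighborhood reconstruction of formulas (10)-(11).
import numpy as np

def reconstruction_points(X, labels, k1=9, t=1.0):
    """X: (D, N) training samples as columns; returns reconstructions Xr (D, N) and S (N, N)."""
    D, N = X.shape
    S = np.zeros((N, N))                       # column i is s_i = [s_i1, ..., s_iN]^T
    for i in range(N):
        same = np.where(labels == labels[i])[0]
        same = same[same != i]                 # same-class candidates, excluding x_i itself
        d2 = np.square(X[:, same] - X[:, [i]]).sum(0)
        idx = np.argsort(d2)[:k1]              # k1 same-class nearest neighbors
        w = np.exp(-d2[idx] / t)
        if idx.size:
            S[same[idx], i] = w / w.sum()      # normalized weights, summing to one
    Xr = X @ S                                 # column i is the reconstruction point X s_i
    return Xr, S
```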
Build the within-class graph $G_w=\{X,W_w\}$, with $X$ as the vertices of the graph: if two vertices $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between them; otherwise there is no edge. The edge weight $w^{w}_{ij}$ expresses the similarity relation between $x_i$ and $x_j$ and is defined by formula (12).
Build the within-class reconstruction graph $G_{rw}=\{\hat X,W_{rw}\}$, with the reconstruction points $\hat X$ as the vertices: if $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise there is no edge. The edge weight $w^{rw}_{ij}$ expresses the similarity relation between $\hat x_i$ and $\hat x_j$ and is defined by formula (13).
Build the between-class graph $G_b=\{X,W_b\}$: if two vertices $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between them; otherwise they are not connected. The edge weight $w^{b}_{ij}$ expresses the degree of closeness between $x_i$ and $x_j$ and is defined by formula (14).
Build the between-class reconstruction graph $G_{rb}=\{\hat X,W_{rb}\}$: if $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise they are not connected. The edge weight $w^{rb}_{ij}$ expresses the degree of closeness between $\hat x_i$ and $\hat x_j$ and is defined by formula (15).
Fig. 3 is a schematic diagram of the graph construction of the LME algorithm. When building the within-class graphs, for a data point x1 not only its same-class neighborhood point (e.g. x2) is considered, but also the within-class reconstruction points of each neighborhood point (e.g. x7, the within-class neighborhood reconstruction point of x1, and x3, that of x2): an edge is built between x1 and x2, and a line is also connected between x3 and x7, with all edge weights set according to formulas (12) and (13). When building the between-class graphs, the relations between each data point (e.g. x4), its different-class neighbor (e.g. x5) and the within-class reconstruction points of each point (e.g. x8, the neighborhood reconstruction point of x4, and x6, that of x5) are considered: x4 and x5 are connected, and x6 and x8 are also connected, with all edge weights set by formulas (14) and (15).
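The graph construction of Fig. 3 can be sketched as follows, assuming the heat-kernel weight form of formulas (12)–(15); the edges of the two reconstruction graphs follow the neighbor relations of the original points, while their weights are computed between the reconstruction points. The exhaustive neighbor search is an illustrative simplification.

```python
# Construction of the four LME graphs of formulas (12)-(15).
import numpy as np

def lme_graphs(X, Xr, labels, k1=9, k2=180, t=1.0):
    """X, Xr: (D, N) samples and reconstruction points; returns Ww, Wrw, Wb, Wrb."""
    N = X.shape[1]
    d2 = np.square(X.T[:, None, :] - X.T[None, :, :]).sum(-1)
    d2r = np.square(Xr.T[:, None, :] - Xr.T[None, :, :]).sum(-1)
    Ww, Wrw, Wb, Wrb = (np.zeros((N, N)) for _ in range(4))
    for i in range(N):
        order = np.argsort(d2[i])
        same = [j for j in order if labels[j] == labels[i] and j != i][:k1]
        diff = [j for j in order if labels[j] != labels[i]][:k2]
        Ww[i, same] = np.exp(-d2[i, same] / t)     # within-class graph, formula (12)
        Wrw[i, same] = np.exp(-d2r[i, same] / t)   # within-class reconstruction graph, formula (13)
        Wb[i, diff] = np.exp(-d2[i, diff] / t)     # between-class graph, formula (14)
        Wrb[i, diff] = np.exp(-d2r[i, diff] / t)   # between-class reconstruction graph, formula (15)
    return tuple(np.maximum(W, W.T) for W in (Ww, Wrw, Wb, Wrb))
```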
For within-class data, the similarity relations of the within-class graph and the within-class reconstruction graph are not changed in the low-dimensional space, so that same-class data and their reconstruction data cluster together as much as possible, reducing the differences between same-class data; the objective function is given by formula (16), with $M_w=L_w+SL_{rw}S^{\mathrm T}$, $L_w=D_w-W_w$, $L_{rw}=D_{rw}-W_{rw}$, the diagonal matrices $D_w$ and $D_{rw}$ defined by $d^{w}_{ii}=\sum_j w^{w}_{ij}$ and $d^{rw}_{ii}=\sum_j w^{rw}_{ij}$, and $S=[s_1,s_2,\ldots,s_N]$, $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$.
For between-class data, the similarities between the data of the between-class graph and the between-class reconstruction graph are suppressed in the low-dimensional space, separating different-class data and increasing the differences between them; the objective function is given by formula (17), with $M_b=L_b+SL_{rb}S^{\mathrm T}$, $L_b=D_b-W_b$, $L_{rb}=D_{rb}-W_{rb}$, and the diagonal matrices $D_b$ and $D_{rb}$ defined by $d^{b}_{ii}=\sum_j w^{b}_{ij}$ and $d^{rb}_{ii}=\sum_j w^{rb}_{ij}$.
The optimization problems of formulas (16) and (17) are converted into formula (18), and by the method of Lagrange multipliers one obtains the generalized eigenvalue problem of formula (19):

$$XM_bX^{\mathrm T}V=\lambda XM_wX^{\mathrm T}V\qquad(19)$$

The eigenvalues of formula (19) are computed and arranged in descending order, and the eigenvectors corresponding to the first $d$ eigenvalues form the projection matrix $V=[v_1,v_2,\ldots,v_d]$.
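A hedged sketch of formulas (16)–(19): assemble $M_w$ and $M_b$ from the four graph Laplacians and the reconstruction-weight matrix $S$, then take the leading generalized eigenvectors. The $M=L+SL_rS^{\mathrm T}$ composition follows the definitions above; the ridge term is an implementation assumption.

```python
# Projection matrix V from the generalized eigenproblem X M_b X^T v = lambda X M_w X^T v.
import numpy as np
from scipy.linalg import eigh

def lme_projection(X, S, Ww, Wrw, Wb, Wrb, d=30):
    """X: (D, N); S: (N, N) reconstruction weights with columns s_i."""
    lap = lambda W: np.diag(W.sum(1)) - W
    Mw = lap(Ww) + S @ lap(Wrw) @ S.T          # within-class term of formula (16)
    Mb = lap(Wb) + S @ lap(Wrb) @ S.T          # between-class term of formula (17)
    A = X @ Mb @ X.T
    B = X @ Mw @ X.T + 1e-6 * np.eye(X.shape[0])   # ridge keeps B positive definite
    vals, vecs = eigh(A, B)                    # ascending generalized eigenvalues
    return vecs[:, ::-1][:, :d]                # descending order: first d eigenvectors form V
```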
With the obtained projection matrix, the dimensionality of the high-dimensional training data is reduced to obtain the low-dimensional embedding features of the training samples. At the same time, the hyperspectral image to be classified is taken as test samples, and the dimensionality of the test data is reduced to obtain the low-dimensional embeddings of the test samples. Combined with a chosen classifier, the low-dimensional embeddings of the test samples are classified, giving the classification result of the hyperspectral image.
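Tying the pieces together, a hedged end-to-end sketch (reusing the functions sketched above) that learns $V$ on the training samples, projects training and test spectra, and classifies with the NN rule; all names are illustrative.

```python
# End-to-end LME feature extraction followed by nearest-neighbor classification.
import numpy as np

def lme_classify(X_train, y_train, X_test, k1=9, k2=180, d=30):
    """X_train: (D, Ntr), X_test: (D, Nte), y_train: (Ntr,) numpy array of labels."""
    Xr, S = reconstruction_points(X_train, y_train, k1=k1)
    Ww, Wrw, Wb, Wrb = lme_graphs(X_train, Xr, y_train, k1=k1, k2=k2)
    V = lme_projection(X_train, S, Ww, Wrw, Wb, Wrb, d=d)
    Ytr, Yte = V.T @ X_train, V.T @ X_test     # low-dimensional embeddings
    d2 = np.square(Yte.T[:, None, :] - Ytr.T[None, :, :]).sum(-1)
    return y_train[np.argmin(d2, axis=1)]      # NN rule in the embedded space
```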
The present invention builds the within-class and between-class graphs from the neighborhood points of the data and the same-class reconstruction points of each neighbor, which better characterizes the intrinsic properties of the hyperspectral image, improves data separability, and thereby raises classification accuracy. To reduce the difficulty of setting the same-class neighbor number $k_1$ and the different-class neighbor number $k_2$, the different-class neighbor number is set to an integral multiple of the same-class neighbor number, because different-class data are generally more numerous than same-class data and small changes of the different-class neighbor number have little influence on the classification results.
To analyze the feasibility of the invention, the Salinas and Indian Pines hyperspectral data sets were chosen for experiments and compared with existing related algorithms.
In the experiments, each data set is randomly divided into a training set and a test set; a feature extraction algorithm is learned on the training set to obtain a low-dimensional embedding space. Then all test samples are mapped into this low-dimensional space to obtain the low-dimensional features of each sample. Finally, the test samples are classified with the nearest-neighbor (NN) classifier, spectral angle mapper (SAM), and composite-kernel support vector machine (SVM based on Composite Kernels, SVMCK). To evaluate the classification results of each method, the average accuracy (AA), overall accuracy (OA) and Kappa coefficient (KC) are used as evaluation indices. To enhance the robustness of the test results, each experiment was repeated 10 times, and the mean and standard deviation of each accuracy were calculated.
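The three indices can be computed from the confusion matrix; a minimal sketch, with the Kappa coefficient in its standard chance-corrected form:

```python
# Overall accuracy (OA), average accuracy (AA) and Kappa coefficient (KC).
import numpy as np

def oa_aa_kappa(y_true, y_pred):
    classes = np.unique(y_true)
    C = np.array([[np.sum((y_true == a) & (y_pred == b)) for b in classes] for a in classes])
    n = C.sum()
    oa = np.trace(C) / n                       # overall accuracy
    aa = np.mean(np.diag(C) / C.sum(1))        # mean of the per-class accuracies
    pe = (C.sum(0) * C.sum(1)).sum() / n**2    # expected chance agreement
    return oa, aa, (oa - pe) / (1 - pe)        # Kappa coefficient
```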
In the experiments, the Baseline (BL), PCA, LDA, LFDA, MMC and MFA algorithms were selected for comparison with the algorithm of the invention, where Baseline means classifying the test samples directly with the classifier. To let each algorithm obtain its best classification result, the parameters of each algorithm were obtained by cross-validation; the neighbor number for LPP, NPE and LFDA was set to 9. Because large changes of the between-class neighbor number have little influence on the classification results of MFA and LME, the within-class neighbor number can be set to $k_1=k$ and the between-class neighbor number to $k_2=\beta k$, where $\beta$ is a positive integer; in the experiments, k and β for MFA and LME were set to 9 and 20 respectively. For the SVMCK classifier, a weighted kernel composed of RBF kernel functions was used, because this kernel gives better classification results than other kernels; the spatial information is represented by the average pixel value in a spatial neighborhood. In the experiments, the penalty parameter C and the RBF kernel parameter δ were obtained by grid search over the range {10^-10, 10^-9, ..., 10^10}, and the spatial neighborhood window size was set to 9 × 9. The low-dimensional embedding dimension of the LDA algorithm was set to c − 1, where c is the number of classes, and the embedding dimension of the other algorithms was set to 30.
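A hedged sketch of the composite-kernel classifier (SVMCK) as described: a weighted sum of RBF kernels on the spectral vector and on the 9 × 9 window mean. The weight `mu` and the shared `gamma` are illustrative assumptions, whereas the text tunes C and δ by grid search.

```python
# Composite-kernel SVM (SVMCK) sketch using a precomputed kernel in scikit-learn.
import numpy as np
from sklearn.svm import SVC

def rbf(A, B, gamma):
    return np.exp(-gamma * np.square(A[:, None, :] - B[None, :, :]).sum(-1))

def svmck_fit_predict(Xs_tr, Xw_tr, y_tr, Xs_te, Xw_te, mu=0.5, gamma=1.0, C=100.0):
    """Xs_*: (N, d) spectral features; Xw_*: (N, d) spatial window-mean features."""
    K_tr = mu * rbf(Xs_tr, Xs_tr, gamma) + (1 - mu) * rbf(Xw_tr, Xw_tr, gamma)
    K_te = mu * rbf(Xs_te, Xs_tr, gamma) + (1 - mu) * rbf(Xw_te, Xw_tr, gamma)
    clf = SVC(C=C, kernel="precomputed").fit(K_tr, y_tr)
    return clf.predict(K_te)
```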
Experiments on the Salinas data set
To analyze the influence of the within-class and between-class neighbor numbers on classification accuracy, 60 data points were randomly selected from each ground-object class of the Salinas data set as training samples, with the remaining data as test samples; after feature extraction, the test samples were classified with NN. The ranges of parameters k and β were set to {3, 5, 7, ..., 25} and {5, 15, 20, ..., 60} respectively, and each condition was repeated 10 times; Fig. 4 shows the average overall accuracy of the proposed algorithm over parameters k and β.
As Fig. 4 shows, the classification accuracy first increases and then decreases as k grows, because a k that is too small or too large cannot effectively express the intrinsic structure of the hyperspectral image. When k is less than 15, the accuracy keeps increasing with β and finally reaches a stable value; when k is greater than 15 and β exceeds 20, the accuracy drops rapidly, because overly large k and β cause the between-class boundary to be over-fitted. On the whole, changes of k and β have little influence on the classification accuracy on the Salinas data set; in the experiments, the optimal values of k and β were set to 9 and 20.
Table 1. Classification accuracy (OA ± std (%) (KC)) of different algorithms with different classifiers on the Salinas data set
To analyze the classification performance of each algorithm with different classifiers under varying numbers of training samples, $n_i$ data points were randomly selected from each ground-object class as training samples, with the remaining data as test samples. The feature extraction algorithms were learned on the training samples; after the low-dimensional features of each sample were obtained, the test samples were classified with NN, SAM and SVMCK, and each case was repeated 10 times. Table 1 gives the average overall accuracies, standard deviations and average Kappa coefficients of the 10 experiments.
As Table 1 shows, the overall accuracy and Kappa coefficient of each algorithm keep increasing with the number of training samples, because more training samples provide more prior information and express the intrinsic characteristics of the hyperspectral image more accurately. Among the classifiers, each algorithm achieves better results with SVMCK than with the others, because SVMCK uses the spectral and spatial information of the data simultaneously to strengthen the classification performance on hyperspectral images. Under every condition, the proposed algorithm gives better results than the MFA algorithm, because it uses the neighborhood of the data and the within-class reconstruction points of the neighbors to characterize the intrinsic manifold structure of the hyperspectral data, strengthening the aggregation of within-class data and the separability of between-class data, obtaining better discriminative features and thereby improving ground-object classification accuracy.
To analyze the classification accuracy of the algorithm with a balanced number of training samples, 2% of the data of each class were selected as training samples, with the remaining data as test samples, and the producer's accuracy of each algorithm was evaluated. After the embedding features of each sample were obtained, the test samples were classified with SVMCK; Table 2 gives the classification accuracy of each algorithm, and Fig. 5 shows the corresponding classification maps.
As Table 2 shows, the LME algorithm has the best classification results on most ground-object classes and possesses the highest overall accuracy, average accuracy and Kappa coefficient, showing that the LME algorithm reveals the intrinsic manifold structure of the hyperspectral image more effectively, extracts better discriminative features, and thereby raises classification accuracy.
Table 2. Classification results of each algorithm with SVMCK on the Salinas data set
In Fig. 5, the LME algorithm produces more homogeneous regions than the other algorithms and agrees better with the true ground objects, especially in the "Grapes", "Corn", "Lettuce 4wk", "Lettuce 7wk" and "Vinyard untrained" object areas.
To compare the classification performance of the algorithms, Table 3 lists the McNemar test values between them. It can be seen from the table that the LME algorithm shows a more significant statistical difference than the other algorithms, illustrating that it better extracts the discriminative features of the ground objects and improves classification accuracy.
Table 3. McNemar tests of each algorithm on the Salinas data set
Experiments on the Indian Pines data set
To analyze the classification performance of the LME algorithm on different ground-object scenes, the Indian Pines data set was additionally selected for experiments. In the experiments, $n_i$ data points were randomly selected from each ground-object class as training samples, with the remaining data as test samples; for classes with few samples, such as "Alfalfa", "Grass/Pasture-mowed" and "Oats", if $n_i\geq N_i/2$ then $n_i=N_i/2$, where $N_i$ is the number of samples of the i-th class.
To explore the influence of parameters k and β on the classification results, 60 data points were randomly selected from each class as training samples, with the remaining data as test samples. After each algorithm was learned on the training samples, the low-dimensional features of each sample were obtained, and the low-dimensional features of the test samples were classified with NN. Fig. 6 gives the classification accuracy under different parameters k and β.
As Fig. 6 shows, the classification accuracy first rises and then declines as k increases. The reason for this phenomenon is that a small k cannot obtain enough information to characterize the intrinsic structure of the hyperspectral image, while an overly large k over-fits when characterizing the intrinsic characteristics of the hyperspectral data. As β keeps increasing, the accuracy also keeps increasing and reaches a stable peak. In the experiments, to obtain the best classification results, k and β were set to 9 and 20 respectively.
To analyze the classification results of each algorithm with different classifiers, 20, 40, 60 and 80 data points were randomly selected from each class as training samples, with the remaining data as test samples. Each case was repeated 10 times; Table 4 gives the average overall accuracy, standard deviation and average Kappa coefficient of each method.
As Table 4 shows, the classification accuracy of each algorithm keeps improving as the number of training samples increases. The results of every algorithm with NN and SAM are unsatisfactory, because NN and SAM discriminate the individual ground objects of Indian Pines poorly, which limits the accuracy. Even so, LME still gives better results with NN and SAM than the other methods, because the LME algorithm effectively reveals the intrinsic manifold structure of the hyperspectral data, obtains better discriminative features, and raises classification accuracy. In addition, every algorithm reaches higher accuracy with SVMCK than with NN and SAM, and LME has the best accuracy with SVMCK under all conditions, because SVMCK uses the spectral and spatial information of the hyperspectral image simultaneously for classification.
Table 4. Classification accuracy (OA ± std (%) (KC)) of each algorithm with different classifiers on the Indian Pines data set
To analyze the classification accuracy of each ground-object class, 10% of the data of each class were randomly selected as training samples; for classes with few samples, 10 data points per class were randomly selected as training samples, with the remaining data as test samples. After the low-dimensional features were obtained, the test samples were classified with the SVMCK classifier; Table 5 gives the classification results of each algorithm.
Table 5. Classification results of each algorithm with SVMCK on the Indian Pines data set
According to the classification results of Table 5, the LME algorithm obtains better accuracy than the other algorithms on most ground-object classes and has the best average accuracy, overall accuracy and Kappa coefficient, because the LME algorithm can effectively reveal the information hidden in the hyperspectral data. Fig. 7 shows the corresponding classification maps: the LME algorithm produces smoother classification results than the other algorithms, and the overall structure is closer to the true ground-object scene.
To compare the classification performance of the algorithms, statistical significance tests were carried out between their classification results; Table 6 gives the McNemar test results between the algorithms. As can be seen from the table, the McNemar test values of every algorithm against the LME algorithm are negative, illustrating that the LME algorithm has a more significant statistical difference, better extracts the intrinsic characteristics of the hyperspectral data, and raises ground-object classification accuracy.
Table 6. McNemar tests of each algorithm on the Indian Pines data set
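The McNemar values reported in Tables 3 and 6 compare two classifiers on the same test set; a minimal sketch, assuming the signed z-score variant $z=(f_{12}-f_{21})/\sqrt{f_{12}+f_{21}}$, in which a negative value against LME means LME classifies better:

```python
# Signed McNemar z-statistic between two classifiers A and B.
import numpy as np

def mcnemar_z(y_true, pred_a, pred_b):
    """|z| > 1.96 indicates a significant difference at the 5% level; positive favors A."""
    f12 = np.sum((pred_a == y_true) & (pred_b != y_true))   # A correct, B wrong
    f21 = np.sum((pred_a != y_true) & (pred_b == y_true))   # A wrong, B correct
    return (f12 - f21) / np.sqrt(f12 + f21)                 # assumes the classifiers disagree somewhere
```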
Two-dimensional embedding analysis
To analyze the low-dimensional embedding features of each algorithm, the five ground-object classes "Corn-mintill", "Grass-trees", "Hay-windrowed", "Wheat" and "Woods" were chosen from the Indian Pines data set for two-dimensional embedding; 100 data points per class were randomly selected as training samples, with the remaining samples as test samples. Each algorithm was learned on the training samples, the two-dimensional embedding of the test samples was obtained, and the distribution of each data point was drawn in the two-dimensional space. Fig. 8 gives the two-dimensional embedding distributions of each algorithm, with the five classes represented by 1, 2, 3, 4 and 5.
As Fig. 8 shows, in the two-dimensional embedding results of PCA, NPE and LPP, same-class ground objects are scattered and different-class ground objects overlap, because these are unsupervised algorithms and cannot effectively characterize the intrinsic characteristics of the data. The supervised LDA and MMC improve the aggregation of same-class data, but overlaps still exist between different-class data, because both originate from statistical theory and ignore the intrinsic structure of the data. MFA can reveal the intrinsic manifold of the data but cannot effectively express the manifold structure of hyperspectral data, so the different-class data in Fig. 8(h) show overlapping distributions. The two-dimensional embedding of the LME algorithm is better than that of all the other algorithms, because LME uses the neighborhood structure of the data and the within-class reconstruction points of each neighbor to characterize the intrinsic structure of the hyperspectral data, strengthening the aggregation of within-class data and the separability of between-class data.
From the classification experiments of the LME algorithm on the Salinas and Indian Pines hyperspectral data sets, the following conclusions can be drawn:
1. LME classifies better than Baseline, PCA, NPE, LPP, LDA, LFDA, MMC and MFA in all cases, because LME characterizes the intrinsic structure of the hyperspectral data with the neighborhood of the data and the within-class reconstruction points of each neighbor, strengthening the aggregation of within-class data and the separability of between-class data. This shows that LME obtains more implicit information from the hyperspectral data.
2. With different classifiers, LME achieves better accuracy than the other algorithms, because LME obtains more effective discriminative features and adapts better, improving the accuracy of NN, SAM and SVMCK simultaneously.
3. The classification results on the Salinas and Indian Pines data sets show that SVMCK is more accurate than NN and SAM in all cases, because SVMCK uses the spectral and spatial information of the hyperspectral image simultaneously for classification, while NN and SAM use only the spectral information.
4. From the time-complexity analysis of the LME algorithm, the running time depends mainly on the data dimensionality, the neighbor numbers and the number of training samples. The running times show that, under the same conditions, LME is more time-consuming than the other algorithms in feature extraction, because LME needs more time to build the within-class and between-class graphs; however, LME reduces the subsequent classification time of the classifier and raises classification accuracy.
5. In the two-dimensional embedding experiments, LME obtains a better distribution than the other algorithms, showing that LME improves the aggregation of within-class data and the separability of between-class data, highlighting the differences between different-class data.
To address the problem that the MFA algorithm cannot effectively characterize the intrinsic structure of hyperspectral images, the present invention proposes the LME algorithm. This algorithm reveals the manifold structure of the hyperspectral data with the neighborhood points of the data and the within-class reconstruction points of each neighbor, builds the within-class and between-class graphs, keeps the graph structures unchanged in the low-dimensional embedding space, strengthens the aggregation of within-class data and the separability of between-class data, obtains discriminative features, and realizes the classification of hyperspectral images. The test results on the Salinas and Indian Pines data sets show that this algorithm obtains better discriminative features than the other feature extraction algorithms, thereby improving the ground-object classification accuracy of hyperspectral images.
Finally, it should be noted that the above examples of the invention are only illustrations of the invention and not restrictions on its embodiments. Although the applicant has described the invention in detail with reference to preferred embodiments, those of ordinary skill in the art can still make changes of other forms on the basis of the above description; all implementations cannot be exhausted here, and every obvious change derived from the technical scheme of the invention remains within the protection scope of the invention.

Claims (5)

1. A hyperspectral image classification method based on local manifold embedding, characterized in that it comprises the following steps:
1) selecting training samples $X=[x_1,x_2,\ldots,x_N]$ with known class labels, where $l_i$ is the class label of $x_i$, and reconstructing each data point of the training samples from its same-class nearest neighbors;
2) building a within-class graph, a within-class reconstruction graph, a between-class graph and a between-class reconstruction graph from the neighborhood of each data point and the reconstruction points corresponding to the neighbors;
3) in the low-dimensional embedding space, keeping the structures of the within-class graph and the within-class reconstruction graph unchanged while suppressing the structural relations of the between-class graph and the between-class reconstruction graph, to obtain the projection matrix from the high-dimensional space to the low-dimensional space;
4) reducing the dimensionality of the high-dimensional training data with the projection matrix obtained in step 3), to obtain the low-dimensional embedding features of the training samples;
5) taking the hyperspectral image to be classified as test samples and reducing the dimensionality of their high-dimensional data with the projection matrix obtained in step 3), to obtain the low-dimensional embeddings of the test samples;
6) classifying the low-dimensional embeddings of the test samples with a chosen classifier according to the low-dimensional embedding features of the training samples obtained in step 4), to obtain the hyperspectral image classification result.
2. The hyperspectral image classification method based on local manifold embedding according to claim 1, characterized in that: in step 1), each data point of the training samples is reconstructed from its same-class nearest neighbors as follows.
For each training data point $x_i$, $k_1$ nearest neighbors are chosen from the same-class data to reconstruct $x_i$; the reconstruction point is

$$\hat{x}_i=\sum_j s_{ij}x_j=Xs_i\qquad(10)$$

In the formula, $s_{ij}$ is the reconstruction weight between data points $x_i$ and $x_j$, with $\sum_j s_{ij}=1$ and $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$; if $x_j$ is a same-class nearest neighbor of $x_i$ then $s_{ij}\neq 0$, otherwise $s_{ij}=0$. It is defined as the normalized value

$$s_{ij}=\begin{cases}\dfrac{\exp(-\|x_i-x_j\|^2/t)}{\sum_{x_l\in N_{k_1}(x_i)}\exp(-\|x_i-x_l\|^2/t)}, & x_j\in N_{k_1}(x_i)\\ 0, & \text{otherwise}\end{cases}\qquad(11)$$

where $N_{k_1}(x_i)$ denotes the $k_1$ same-class nearest neighbors of $x_i$ and $t$ is the heat-kernel parameter.
3. The hyperspectral image classification method based on local manifold embedding according to claim 1, characterized in that: in step 2), the within-class graph, within-class reconstruction graph, between-class graph and between-class reconstruction graph are constructed as follows.
Build the within-class graph $G_w=\{X,W_w\}$, with $X$ as the vertices of the graph: if two vertices $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between them; otherwise there is no edge. The edge weight $w^{w}_{ij}$ expresses the similarity relation between $x_i$ and $x_j$ and is defined as

$$w^{w}_{ij}=\begin{cases}\exp(-\|x_i-x_j\|^2/t), & l_i=l_j \text{ and } \big(x_j\in N_{k_1}(x_i) \text{ or } x_i\in N_{k_1}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(12)$$

where $t$ is the heat-kernel parameter.
Build the within-class reconstruction graph $G_{rw}=\{\hat X,W_{rw}\}$, with the reconstruction points $\hat X$ as the vertices: if $x_i$ and $x_j$ are among each other's $k_1$ same-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise there is no edge. The edge weight $w^{rw}_{ij}$ expresses the similarity relation between $\hat x_i$ and $\hat x_j$ and is defined as

$$w^{rw}_{ij}=\begin{cases}\exp(-\|\hat x_i-\hat x_j\|^2/t), & l_i=l_j \text{ and } \big(x_j\in N_{k_1}(x_i) \text{ or } x_i\in N_{k_1}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(13)$$

Build the between-class graph $G_b=\{X,W_b\}$: if two vertices $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between them; otherwise they are not connected. The edge weight $w^{b}_{ij}$ expresses the degree of closeness between $x_i$ and $x_j$ and is defined as

$$w^{b}_{ij}=\begin{cases}\exp(-\|x_i-x_j\|^2/t), & l_i\neq l_j \text{ and } \big(x_j\in N^{b}_{k_2}(x_i) \text{ or } x_i\in N^{b}_{k_2}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(14)$$

where $N^{b}_{k_2}(x_i)$ denotes the $k_2$ different-class nearest neighbors of $x_i$.
Build the between-class reconstruction graph $G_{rb}=\{\hat X,W_{rb}\}$: if $x_i$ and $x_j$ are among each other's $k_2$ different-class nearest neighbors, a connecting edge is built between the corresponding $\hat x_i$ and $\hat x_j$; otherwise they are not connected. The edge weight $w^{rb}_{ij}$ expresses the degree of closeness between $\hat x_i$ and $\hat x_j$ and is defined as

$$w^{rb}_{ij}=\begin{cases}\exp(-\|\hat x_i-\hat x_j\|^2/t), & l_i\neq l_j \text{ and } \big(x_j\in N^{b}_{k_2}(x_i) \text{ or } x_i\in N^{b}_{k_2}(x_j)\big)\\ 0, & \text{otherwise}\end{cases}\qquad(15)$$
4. The hyperspectral image classification method based on local manifold embedding according to claim 1, characterized in that: in step 3), the projection matrix is determined as follows.
For within-class data, the similarity relations of the within-class graph and the within-class reconstruction graph are not changed in the low-dimensional space, so that same-class data and their reconstruction data cluster together as much as possible, reducing the differences between same-class data; the objective function is expressed as

$$\min_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w^{w}_{ij}+\sum_{i\neq j}\|V^{\mathrm T}\hat x_i-V^{\mathrm T}\hat x_j\|^2\,w^{rw}_{ij}=\min_V\ \operatorname{tr}(V^{\mathrm T}XM_wX^{\mathrm T}V)\qquad(16)$$

In the formula, $M_w=L_w+SL_{rw}S^{\mathrm T}$, with $L_w=D_w-W_w$ and $L_{rw}=D_{rw}-W_{rw}$; $D_w$ and $D_{rw}$ are diagonal matrices with $d^{w}_{ii}=\sum_j w^{w}_{ij}$ and $d^{rw}_{ii}=\sum_j w^{rw}_{ij}$; $S=[s_1,s_2,\ldots,s_N]$, $s_i=[s_{i1},s_{i2},\ldots,s_{iN}]^{\mathrm T}$.
For between-class data, the similarities between the data of the between-class graph and the between-class reconstruction graph are suppressed in the low-dimensional space, separating different-class data and increasing the differences between them; the objective function is expressed as

$$\max_V\ \sum_{i\neq j}\|V^{\mathrm T}x_i-V^{\mathrm T}x_j\|^2\,w^{b}_{ij}+\sum_{i\neq j}\|V^{\mathrm T}\hat x_i-V^{\mathrm T}\hat x_j\|^2\,w^{rb}_{ij}=\max_V\ \operatorname{tr}(V^{\mathrm T}XM_bX^{\mathrm T}V)\qquad(17)$$

In the formula, $M_b=L_b+SL_{rb}S^{\mathrm T}$, with $L_b=D_b-W_b$ and $L_{rb}=D_{rb}-W_{rb}$; $D_b$ and $D_{rb}$ are diagonal matrices with $d^{b}_{ii}=\sum_j w^{b}_{ij}$ and $d^{rb}_{ii}=\sum_j w^{rb}_{ij}$.
The optimization problems of formulas (16) and (17) are converted into

$$\max_V\ \frac{V^{\mathrm T}XM_bX^{\mathrm T}V}{V^{\mathrm T}XM_wX^{\mathrm T}V}\qquad(18)$$

By the method of Lagrange multipliers, one obtains

$$XM_bX^{\mathrm T}V=\lambda XM_wX^{\mathrm T}V\qquad(19)$$

The eigenvalues of formula (19) are computed and arranged in descending order, and the eigenvectors corresponding to the first $d$ eigenvalues form the projection matrix $V=[v_1,v_2,\ldots,v_d]$.
5. The hyperspectral image classification method based on local manifold embedding according to claim 3, characterized in that: when building the graphs, the different-class neighbor number is set to an integral multiple of the same-class neighbor number, i.e. $k_2$ is an integral multiple of $k_1$.
CN201611219213.9A 2016-12-26 2016-12-26 Hyperspectral image classification method based on local manifold embedding Pending CN106778885A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611219213.9A CN106778885A (en) Hyperspectral image classification method based on local manifold embedding


Publications (1)

Publication Number Publication Date
CN106778885A true CN106778885A (en) 2017-05-31

Family

ID=58926336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611219213.9A Pending CN106778885A (en) Hyperspectral image classification method based on local manifold embedding

Country Status (1)

Country Link
CN (1) CN106778885A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103729652A (en) * 2014-01-17 2014-04-16 重庆大学 Sparsity preserving manifold embedding based hyperspectral remote sensing image classification method
CN104408466A (en) * 2014-11-17 2015-03-11 中国地质大学(武汉) Semi-supervision and classification method for hyper-spectral remote sensing images based on local stream type learning composition
CN106126474A (en) * 2016-04-13 2016-11-16 扬州大学 A kind of linear classification method embedded based on local spline

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FULIN LUO et al.: "Dimensionality reduction of hyperspectral images with local geometric structure Fisher analysis", 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS) *
罗甫林 (LUO Fulin): "Research on sparse manifold learning methods for hyperspectral images", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107609580B (en) * 2017-08-29 2021-02-02 天津大学 Direct-push type low-rank tensor discriminability analysis method
CN107609580A (en) * 2017-08-29 2018-01-19 天津大学 A kind of low-rank tensor identification analysis method of direct-push
CN108520281A (en) * 2018-04-13 2018-09-11 上海海洋大学 A kind of semi-supervised dimension reduction method of high spectrum image kept based on overall situation and partial situation
CN110070485A (en) * 2019-04-04 2019-07-30 南京信息工程大学 A kind of high-spectrum image dimensionality reduction method
CN110619370A (en) * 2019-09-23 2019-12-27 云南电网有限责任公司电力科学研究院 Hyperspectral image super-pixel local linear embedding dimension reduction method
CN110852304A (en) * 2019-11-22 2020-02-28 重庆大学 Hyperspectral data processing method based on deep learning method
CN110852304B (en) * 2019-11-22 2022-03-22 重庆大学 Hyperspectral data processing method based on deep learning method
CN111783865A (en) * 2020-06-23 2020-10-16 西北工业大学 Hyperspectral classification method based on space spectrum neighborhood embedding and optimal similarity graph
CN112926658A (en) * 2021-02-26 2021-06-08 西安交通大学 Image clustering method and device based on two-dimensional data embedding and adjacent topological graph
CN112926658B (en) * 2021-02-26 2023-03-21 西安交通大学 Image clustering method and device based on two-dimensional data embedding and adjacent topological graph
CN114898193A (en) * 2022-07-11 2022-08-12 之江实验室 Manifold learning-based image feature fusion method and device and image classification system
CN115861683A (en) * 2022-11-16 2023-03-28 西安科技大学 Rapid dimensionality reduction method for hyperspectral image
CN115861683B (en) * 2022-11-16 2024-01-16 西安科技大学 Rapid dimension reduction method for hyperspectral image


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170531