CN103729651A - Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles - Google Patents


Info

Publication number
CN103729651A
CN103729651A
Authority
CN
China
Prior art keywords
neighbour
data
classification
point
lle
Prior art date
Legal status
Pending
Application number
CN201410023922.4A
Other languages
Chinese (zh)
Inventor
刘嘉敏
罗甫林
黄鸿
李连泽
刘军委
Current Assignee
Chongqing University
Original Assignee
Chongqing University
Priority date
Filing date
Publication date
Application filed by Chongqing University
Priority to CN201410023922.4A
Publication of CN103729651A

Landscapes

  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The invention discloses a hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles. A coarse set of neighbors is first obtained with the conventional Euclidean distance, and an accurate set is then selected from it by spectral angle. Each sample is locally reconstructed from these neighbors so that the reconstruction error is minimized; the local reconstruction relationship is kept unchanged in the low-dimensional space, again minimizing the reconstruction error, so that the discriminative features hidden in the high-dimensional data can be extracted. During classification, the neighbors of a new sample are likewise obtained with the Euclidean distance, the spectral angles between the new sample and these neighbors are computed, and the new sample is assigned to the class of the neighbor with the smallest spectral angle. The method extracts discriminative features effectively, yields accurate classification results, and performs well on land-cover classification of hyperspectral remote sensing images.

Description

Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles
Technical field
The present invention relates to improvements in feature extraction and classification techniques for hyperspectral remote sensing images, and specifically to a hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles. It belongs to the field of hyperspectral remote sensing image feature extraction and classification.
Background technology
Researchers proposed hyperspectral remote sensing in the early 1980s on the basis of multispectral remote sensing. The spectral resolution of hyperspectral remote sensing images reaches the order of 10⁻²λ (nanometer scale), the wavelength range extends from visible light to short-wave infrared, and the number of spectral bands reaches dozens or even hundreds. Such high spectral resolution makes the interval between adjacent bands very narrow, with overlapping band regions, so the spectral channels are no longer discrete but continuous; hyperspectral remote sensing is therefore often called imaging spectroscopy. It can not only solve the recognition of broad land-cover classes, but also support fine within-class discrimination and detailed spectral feature extraction. Classification of a hyperspectral remote sensing image first requires feature extraction from the data, i.e., dimensionality reduction, after which the extracted features are classified.
One, feature extracting method introduction
Hyperspectral remote sensing images are acquired by imaging spectrometers and contain abundant information, which has brought new opportunities for land-cover research. However, because the data volume is large, the correlation between bands is strong, the redundancy is high, the dimensionality is high, and the information is implicit, traditional classification methods easily suffer from the Hughes phenomenon, i.e., the "curse of dimensionality". How to effectively extract the hidden features from high-dimensional data and reduce its dimensionality has therefore become a focus of hyperspectral remote sensing data processing research.
1, traditional characteristic extracting method
For traditional feature extraction, scholars have proposed several classical methods, mainly principal component analysis (Principal Component Analysis, PCA), linear discriminant analysis (Linear Discriminant Analysis, LDA), multidimensional scaling (Multidimensional Scaling, MDS), the minimum noise fraction transform (Minimum Noise Fraction, MNF), and independent component analysis (Independent Component Analysis, ICA). All of these take a globally linear data structure as their premise and seek the linear model closest to the data under some optimality criterion; they are all linear dimensionality reduction methods. Their clear physical models, easy interpretability, simple procedures, good extensibility, and wide applicability have given them a leading position in feature extraction research. However, the structure of hyperspectral remote sensing data is very complex and not linearly distributed, so traditional dimensionality reduction methods cannot achieve good results on it.
2, manifold learning
The main idea of manifold learning is to learn from the sample data, discover the implicit information from its geometric structure, find the essential characteristics of each sample, and obtain the low-dimensional embedded features or data mappings hidden in the high-dimensional data, thereby realizing dimensionality reduction from the high-dimensional space to a low-dimensional space, or two- or three-dimensional visualization of the data. Hyperspectral remote sensing data are nonlinearly distributed; manifold learning can reveal the implicit low-dimensional manifold structure well, effectively extract the discriminative features of each sample, and obtain better results. The main manifold learning algorithms are locally linear embedding (LLE), isometric mapping (ISOMAP), Laplacian eigenmaps (LE), neighborhood preserving embedding (NPE), and locality preserving projections (LPP).
2.1) The main idea of the LLE algorithm is that nonlinearly structured data present a linear structure within a local range; by preserving this local linear relationship, the low-dimensional manifold structure is revealed from the high-dimensional data, the manifold features of each sample are extracted, and dimensionality reduction is realized. Its main procedure is to represent each data point in the high-dimensional space linearly by its k neighbor points, keep the weights of the corresponding neighbors unchanged after dimensionality reduction, and reconstruct the corresponding low-dimensional points so that the reconstruction error is minimized, thereby effectively extracting the manifold features of nonlinearly structured data. Through local reconstruction, and by keeping the local reconstruction relationship unchanged in the low-dimensional space, LLE extracts the implicit low-dimensional manifold features and reveals the nonlinear structural relationships in the data; for hyperspectral remote sensing data it can effectively exhibit the intrinsic geometric structure. However, the choice of neighbors in LLE has a large influence on the result.
2.2) ISOMAP approximates the geometric relationships between data points by geodesic distance: when projecting from the high-dimensional space to the low-dimensional space, the geodesic distances between data points are kept unchanged, so points that are geometrically far apart in the high-dimensional space remain far apart after projection, and points that are close remain close. ISOMAP introduced the concept of geodesic distance, which reflects the true geometric relationships between points in the high-dimensional space more faithfully and can better extract the intrinsic low-dimensional manifold in nonlinear data, achieving good dimensionality reduction. However, its computational complexity is high, neighbors still have to be chosen, and this choice has a large influence on the result.
2.3) The basic idea of the LE algorithm is that data points far apart in the high-dimensional space remain far apart in the low-dimensional embedded space after dimensionality reduction, and points that are close remain close; the goal is that the local geometric structure between data points is maintained, and best expressed, in the low-dimensional space. LE uses spectral graph theory: by setting different weights between data points, the local neighborhood information can be well reflected in the low-dimensional space. LE guarantees that geometrically neighboring points keep their neighboring positional relationships when projected from the high-dimensional to the low-dimensional space, and can effectively extract the manifold features of the data. However, points that are geometrically far apart remain far apart after projection, which easily produces a hollowing (cavitation) effect.
2.4) The NPE algorithm is in essence a linear approximation of LLE: after the data set is mapped to the low-dimensional space, the intrinsic local neighborhood manifold of the data is kept unchanged and a projection matrix from the high-dimensional to the low-dimensional space is obtained; when a new sample arrives, its low-dimensional embedding can be obtained directly through the projection matrix. NPE keeps the local neighborhood structure unchanged, extracts the low-dimensional manifold structure in high-dimensional data, handles new samples more effectively, and realizes dimensionality reduction. However, it is strongly affected by the choice of neighbors, and, being a linear dimensionality reduction method, it does not work very well on nonlinearly structured data.
2.5) The main idea of the LPP algorithm is to keep the neighbor relationships of the original high-dimensional data unchanged in the low-dimensional space; it is essentially a linearization of the LE algorithm. LPP has an explicit projection matrix, so new samples can be projected directly into the low-dimensional space. It handles the new-sample problem well, preserves the local geometric structure of the data, and realizes dimensionality reduction. However, LPP considers only local relationships, is strongly affected by the choice of neighbors, and is sensitive to noise.
Two, the sorting technique of high-spectrum remote sensing data
In a hyperspectral remote sensing image, the differences between land-cover types are expressed by the spectral information and geometric spatial information of the pixels; different land-cover types have different spectral and spatial characteristics. Hyperspectral remote sensing image classification takes the spectral and spatial features of the image pixels as its basis and determines and labels the class attribute of each pixel, or of relatively homogeneous pixel groups, representing the different land-cover classes. By analyzing the spectral and geometric spatial information of each land-cover type in the image, the features with maximum separability are obtained, a suitable classification system is selected, and each pixel is assigned to the corresponding class. The main traditional classification methods are as follows.
1, k arest neighbors classification (k-NN)
The k-nearest-neighbor algorithm (k-Nearest Neighbor, k-NN) is a classical classification method that judges the class of an unknown data point from its k known neighbors; its theory is mature and it is widely applied in pattern classification. k-NN needs no sample training and is little affected by noise, but the value of k has a large influence on the result, its selection requires repeated adjustment, and when the dimensionality is high the computation time is long.
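As an illustration of the rule just described, a minimal k-NN classifier can be sketched as follows (a NumPy sketch with hypothetical toy data; not the patent's implementation):

```python
import numpy as np

def knn_classify(x, train_X, train_y, k=3):
    """Assign x the majority class among its k nearest training
    samples under the Euclidean distance."""
    d = np.linalg.norm(train_X - x, axis=1)   # distances to all training points
    nearest = np.argsort(d)[:k]               # indices of the k closest points
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]          # majority vote

# Hypothetical toy data: two well-separated classes
train_X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]])
train_y = np.array([0, 0, 1, 1])
print(knn_classify(np.array([0.2, 0.1]), train_X, train_y, k=3))  # -> 0
```

As the text notes, the result depends directly on k: with k = 4 the vote here would tie, which is why the value must be tuned.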
2, spectrum angle drawing classification (SAM)
Spectral angle mapping (Spectral Angle Mapping, SAM) is a measure of the similarity of spectral waveforms. It treats every spectrum as a vector in the spectral space and determines the degree of similarity between an unknown and a known spectrum by computing the angle between them, thereby determining the class of the unknown spectral curve. The smaller the angle between two spectral curves, the more similar they are and the more likely they belong to the same class, so the class of an unknown spectral curve can be determined by computing its angles to known curves.
SAM is based on statistical characteristics and is independent of the modulus of the spectral vector, so it has good anti-interference capability, is little affected by illumination changes, and is little affected by the "same object, different spectra" phenomenon; it is now widely used in spectral data classification with good effect. However, when two spectral curves are very similar, SAM has difficulty discriminating between them and cannot obtain a good classification result.
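The SAM measure itself is a one-liner; the sketch below (illustrative NumPy, with hypothetical spectra) follows the angle definition used later in formulas (10) and (11), including the absolute value:

```python
import numpy as np

def spectral_angle(x, y):
    """Spectral angle between two spectra:
    theta = arccos(|<x, y>| / (||x|| * ||y||))."""
    cos = abs(np.dot(x, y)) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(cos, 0.0, 1.0))  # clip guards against rounding

a = np.array([1.0, 2.0, 3.0])
b = 2.5 * a                        # same spectral shape, brighter
c = np.array([3.0, 2.0, 1.0])      # different shape
print(spectral_angle(a, b) < 1e-6)                  # -> True (scale-invariant)
print(spectral_angle(a, b) < spectral_angle(a, c))  # -> True
```

Because b is a scaled copy of a, their angle is essentially zero, illustrating the insensitivity to illumination changes noted above.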
3, support vector machine classification (SVM)
Support vector machines (Support Vector Machines, SVM) take the VC-dimension theory of statistics and the structural risk minimization principle as their theoretical foundation. Given limited sample information, they seek the best compromise between model complexity (the learning precision on the specific training samples) and learning ability (the ability to correctly recognize arbitrary samples) in order to obtain the best generalization ability. Based on statistics, SVM seeks the optimal separating hyperplane between the data classes; by mapping nonlinear data into a kernel-function space it linearizes the problem, simplifying the computational complexity, and achieves good classification results. However, how to choose the subspace and build an appropriate model remains the difficulty of using SVM.
From the above review of the prior art it can be seen that current feature extraction and classification methods each have shortcomings: the feature extraction methods cannot effectively extract discriminative features, and the classification methods either depend on many influencing factors or have certain limitations; both can degrade classification accuracy.
Summary of the invention
In view of the shortcomings of the prior art, the object of the present invention is to provide a hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles that extracts discriminative features more effectively, yields more accurate classification results, and achieves a better land-cover classification effect on hyperspectral remote sensing images.
To achieve these goals, the technical solution used in the present invention is as follows:
A hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles, characterized in that its steps are:
1) Randomly select a number of samples with known class information from the data set as training samples, and then randomly select a number of samples as test samples;
2) Use the Euclidean distance for coarse neighbor screening: for each training sample point, choose the m training sample points with the smallest Euclidean distance as its coarse neighborhood N_O;
3) Within the coarse neighborhood N_O, use formula (1) to compute the spectral angle between each training sample point and the remaining points, and take the k points with the smallest spectral angle as the accurate neighborhood N_A, where k < m;

$$\theta(X_i,X_j)=\cos^{-1}\left(\left|\frac{(X_i,X_j)}{\sqrt{(X_i,X_i)(X_j,X_j)}}\right|\right)\qquad(1)$$

where θ(X_i, X_j) denotes the angle between the two spectral-curve vectors X_i and X_j, cos⁻¹(·) denotes the arc cosine, |·| denotes the absolute value, and (·,·) denotes the inner product;
4) According to the accurate neighborhood N_A, use formula (2) to reconstruct each training sample point locally so that the reconstruction error is minimized, obtaining the local reconstruction weight matrix W;

$$\min\,\varepsilon(W_i)=\sum_{i=1}^{N}\left\|x_i-\sum_{j=1}^{N}w_{ij}x_j\right\|^2\qquad(2)$$

where w_ij is the weight between x_i and x_j, subject to the constraint

$$\sum_{j=1}^{N}w_{ij}=1$$

and w_ij ≠ 0 if x_j is a neighbor of x_i, otherwise w_ij = 0;
5) In the low-dimensional space, keep the local neighbor relationships and the reconstruction weights unchanged; the low-dimensional embedding result Y of the training sample points is obtained from formula (3);

$$\min\,\varepsilon(y_i)=\sum_{i=1}^{N}\left\|y_i-\sum_{j=1}^{N}w_{ij}y_j\right\|^2=\sum_{i=1}^{N}\left\|YI_i-YW_i\right\|^2=\mathrm{tr}(YMY^T)\qquad(3)$$

where I_i is the i-th column of the identity matrix, W_i is the local reconstruction weight vector of data point i, W = [W_1, W_2, ..., W_N]^T, M = (I−W)(I−W)^T, and M is a symmetric, positive semidefinite matrix; the constraints are

$$\sum_{i=1}^{N}y_i=0,\qquad \frac{1}{N}YY^T=I$$

where I is the identity matrix;
6) Add a test sample point to the training samples. First obtain the m coarse neighbors of this test sample point by Euclidean distance, then compute the spectral angles between the test sample point and these m coarse neighbors; the k neighbors with the smallest spectral angles are the accurate neighbors of the test sample point, where k < m. Reconstruct the test sample locally from these k accurate neighbors while minimizing the reconstruction error to obtain the reconstruction weights; finally, linearly combine the low-dimensional embeddings of the k accurate neighbors with these reconstruction weights to obtain the low-dimensional embedding result of the test sample;
7) Use a classifier to classify the dimension-reduced test sample data according to the dimension-reduced training samples and their class information, thereby obtaining the class information of the test samples.
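Steps 2) and 3) above — coarse Euclidean screening followed by spectral-angle refinement — can be sketched as follows (an illustrative NumPy sketch under assumed toy data, not the patent's code; m and k as in the text):

```python
import numpy as np

def lsa_neighbors(X, i, m, k):
    """For sample X[i]: take the m Euclidean-nearest points as the coarse
    neighbourhood N_O, then keep the k of them with the smallest spectral
    angle to X[i] as the accurate neighbourhood N_A (k < m)."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf                                  # exclude the point itself
    N_O = np.argsort(d)[:m]                        # coarse: m nearest by distance
    cos = np.abs(X[N_O] @ X[i]) / (
        np.linalg.norm(X[N_O], axis=1) * np.linalg.norm(X[i]))
    angles = np.arccos(np.clip(cos, 0.0, 1.0))
    return N_O[np.argsort(angles)[:k]]             # accurate: k smallest angles

# Point 2 is Euclidean-closest to point 0, but point 1 spans a smaller angle
X = np.array([[1.0, 0.0], [2.0, 0.0], [0.9, 0.5], [10.0, 10.0]])
print(lsa_neighbors(X, i=0, m=2, k=1))  # -> [1]
```

With Euclidean distance alone the single neighbor of point 0 would be point 2; the spectral-angle refinement replaces it with point 1, which lies in almost the same direction.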
The classifier of step 7) works as follows: first obtain the m neighbors N_O of the dimension-reduced data point y_i by Euclidean distance; then compute the spectral angle between y_i and each data point in N_O; finally, take the class of the data point with the smallest spectral angle to y_i as the class of y_i. The procedure is as follows:
1. Obtain, by Euclidean distance, the m neighbors N_O of the dimension-reduced test sample point y_i among all dimension-reduced training samples, as shown in formula (4):

$$N_O=\left\{\min_{j=1}^{n}\left(D\left(y_i,\,y_j^{C_j}\right)\right)_m\right\}\qquad(4)$$

where min(·)_m denotes taking the m smallest values, D(·) denotes the Euclidean distance, y_i is the data point of unknown class, and y_j^{C_j} is a known data point of class C_j;
2. Within the neighborhood N_O, obtain the class of y_i according to the spectral angle, as shown in formula (5):

$$l_i=\min_{j=1}^{m}\left(\theta\left(y_i,\,y_j^{C_j}\mid N_O\right)\right)\qquad(5)$$

where l_i is the class assigned to the unknown data point y_i, and y_j^{C_j} is a data point of class C_j within the neighborhood N_O.
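The classifier of formulas (4) and (5) can be sketched as follows (an illustrative NumPy sketch under assumed toy data, not the patent's code):

```python
import numpy as np

def lsann_classify(y, train_Y, train_labels, m):
    """Local-spectral-angle nearest neighbour: restrict to the m
    Euclidean neighbours N_O of y (formula 4), then return the label of
    the neighbour with the smallest spectral angle to y (formula 5)."""
    d = np.linalg.norm(train_Y - y, axis=1)
    N_O = np.argsort(d)[:m]                        # m Euclidean neighbours
    cos = np.abs(train_Y[N_O] @ y) / (
        np.linalg.norm(train_Y[N_O], axis=1) * np.linalg.norm(y))
    angles = np.arccos(np.clip(cos, 0.0, 1.0))
    return train_labels[N_O[np.argmin(angles)]]    # smallest-angle neighbour

train_Y = np.array([[1.0, 0.0], [0.9, 0.5], [0.0, 1.0]])
train_labels = np.array([0, 1, 2])
print(lsann_classify(np.array([2.0, 0.1]), train_Y, train_labels, m=2))  # -> 0
```

The query point lies almost on the direction of the class-0 sample, so the angle criterion selects class 0 within the Euclidean neighborhood.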
The present invention proposes a local-spectral-angle neighbor measurement method: a coarse neighborhood is first obtained by Euclidean distance, and accurate neighbors are then obtained within it by spectral angle. The neighbors obtained in this way better reflect the local geometric structure and are more accurate, so the extracted features better embody the intrinsic characteristics of the data and provide more reliable discriminative features for subsequent classification.
The present invention also proposes a local-spectral-angle nearest-neighbor classifier: a coarse neighborhood is first obtained by Euclidean distance, and the class of the unknown data point is then judged from the class of the data point with the smallest spectral angle within it. This better reflects the neighbor relationships among same-class data, increases the probability of correctly judging the class of unknown data, improves the stability of classification, and improves the land-cover classification effect.
Therefore, the hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles proposed by the present invention extracts discriminative features more effectively, yields more accurate classification results, and achieves a better land-cover classification effect on hyperspectral remote sensing images. Experimental results on the Indian Pines and KSC hyperspectral data sets confirm this.
Description of the drawings
Fig. 1 — Neighbor relationships between data points.
Fig. 2 — Spectral curves of five data points from the KSC data set.
Fig. 3 — Classification flow chart of the present invention.
Fig. 4 — Indian Pines hyperspectral remote sensing image.
Fig. 5 — Overall classification accuracy of each algorithm on the Indian Pines data set for different values of k and different dimensions.
Fig. 6 — Per-class land-cover classification results of the different methods on the Indian Pines data set.
Fig. 7 — KSC hyperspectral remote sensing image.
Fig. 8 — Overall classification accuracy of each algorithm on the KSC data set for different values of k and different dimensions.
Fig. 9 — Per-class land-cover classification results of the different methods on the KSC data set.
Embodiment
The present invention is described in detail below with reference to the accompanying drawings.
Because the feature extraction of this method is based on the LLE algorithm and simultaneously applies the spectral angle principle, the feature extraction method of the present invention is called the local spectral angle LLE algorithm, denoted LSA-LLE (Local Linear Embedding based on Local Spectral Angle); the classification method of the present invention is called the local spectral angle nearest-neighbor classifier, denoted LSANN (Local Spectral Angle Nearest Neighbor). To make the feature extraction method easier to understand, the LLE algorithm and the spectral angle principle are first introduced below.
LLE algorithm principle
The main idea of the LLE algorithm is that nonlinearly structured data present a linear structure within a local range; by preserving this local linear relationship, the low-dimensional manifold structure is revealed from the high-dimensional data, the manifold features of each sample are extracted, and dimensionality reduction is realized. Its main procedure is to represent each data point in the high-dimensional space linearly by its k neighbor points, keep the weights of the corresponding neighbors unchanged after dimensionality reduction, and reconstruct the corresponding low-dimensional points so that the reconstruction error is minimized, thereby effectively extracting the manifold features of nonlinearly structured data.
The main steps of the LLE algorithm are as follows:
1. Find the k neighbor points of data point x_i.
First method: use the k-nearest-neighbor rule — compute the Euclidean distances between x_i and the remaining data points, and take the k points with the smallest distances as its neighbors.
Second method: use the ε rule — compute the Euclidean distance between x_i and x_j; if it is smaller than some threshold ε, x_j is regarded as a neighbor of x_i; otherwise, x_j is not a neighbor of x_i.
2. Compute the local reconstruction weights W.
In the high-dimensional space, each of the N data points x_i can be approximately linearly represented by its k neighbor points; the local reconstruction weights W_i are found by minimizing the reconstruction error of x_i, with the objective function:

$$\min\,\varepsilon(W_i)=\sum_{i=1}^{N}\left\|x_i-\sum_{j=1}^{N}w_{ij}x_j\right\|^2\qquad(6)$$

where w_ij is the weight between x_i and x_j, subject to the constraint

$$\sum_{j=1}^{N}w_{ij}=1$$

and w_ij ≠ 0 if x_j is a neighbor of x_i, otherwise w_ij = 0.
Under this constraint, the local reconstruction weights W can be obtained from formula (6) by the method of Lagrange multipliers.
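The Lagrange-multiplier solution of formula (6) for a single point has the well-known closed form w ∝ C⁻¹1, where C is the local Gram matrix of the neighbors shifted to the point's frame. The sketch below applies it (illustrative NumPy; the regularization constant `reg` is an assumption, not from the patent):

```python
import numpy as np

def reconstruction_weights(x, neighbors, reg=1e-3):
    """Minimise ||x - sum_j w_j * neighbors[j]||^2 subject to sum(w) = 1
    via the closed form w ~ C^{-1} 1, with C the local Gram matrix."""
    Z = neighbors - x                        # shift neighbours into x's frame
    C = Z @ Z.T                              # k x k local Gram matrix
    C += reg * np.trace(C) * np.eye(len(C))  # regularise a (near-)singular C
    w = np.linalg.solve(C, np.ones(len(C)))
    return w / w.sum()                       # enforce the sum-to-one constraint

# A point midway between its two neighbours is reconstructed with equal weights
x = np.array([0.5, 0.0])
neighbors = np.array([[0.0, 0.0], [1.0, 0.0]])
print(reconstruction_weights(x, neighbors))  # ~ [0.5 0.5]
```

The regularization is needed because C is singular whenever the number of neighbors exceeds the local dimensionality, which is the common case for hyperspectral data.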
3. Compute the low-dimensional embedding result Y.
After dimensionality reduction, the k neighbors of x_i become the k neighbors of y_i. Keeping the local reconstruction weights W unchanged, y_i is found by minimizing the reconstruction error in the low-dimensional space, with the objective function:

$$\min\,\varepsilon(y_i)=\sum_{i=1}^{N}\left\|y_i-\sum_{j=1}^{N}w_{ij}y_j\right\|^2\qquad(7)$$

To obtain a unique solution, two constraints are added: (a) the center of gravity after projection is placed at the origin, i.e.

$$\sum_{i=1}^{N}y_i=0$$

(b) the dimension-reduced data points are normalized, i.e.

$$\frac{1}{N}YY^T=I$$

where I is the identity matrix.
From formula (7) one obtains:

$$\min\,\varepsilon(Y)=\sum_{i=1}^{N}\left\|YI_i-YW_i\right\|^2=\mathrm{tr}\left(Y(I-W)(I-W)^TY^T\right)=\mathrm{tr}(YMY^T)\qquad(8)$$

where I_i is the i-th column of the identity matrix, W_i is the local reconstruction weight vector of data point i, W = [W_1, W_2, ..., W_N]^T, M = (I−W)(I−W)^T, and M is a symmetric, positive semidefinite matrix.
Under the constraints, applying the method of Lagrange multipliers to formula (8) gives:

$$MY^T=\lambda Y^T\qquad(9)$$
To embed the data in a d-dimensional space with minimum reconstruction error, the eigenvalues of M are computed, the near-zero eigenvalue is discarded, the d smallest remaining eigenvalues are taken, and the matrix formed by the corresponding eigenvectors is used as the low-dimensional embedding result Y.
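This eigen-step can be sketched as follows. Note that the illustrative NumPy sketch uses the standard LLE convention M = (I−W)ᵀ(I−W) for column-vector embeddings, which has the required near-zero eigenvalue whenever the rows of W sum to one; the toy weight matrix is an assumption:

```python
import numpy as np

def lle_embedding(W, d):
    """Given the N x N reconstruction-weight matrix W (rows sum to 1),
    form M, discard the near-zero eigenvalue, and return the eigenvectors
    of the next d smallest eigenvalues as the d x N embedding Y."""
    N = W.shape[0]
    I = np.eye(N)
    M = (I - W).T @ (I - W)
    vals, vecs = np.linalg.eigh(M)           # eigenvalues in ascending order
    return vecs[:, 1:d + 1].T * np.sqrt(N)   # skip ~0; scale so (1/N) Y Y^T = I

# Toy weights for 4 points on a line: each interior point is the mean of its
# two neighbours, each endpoint is reconstructed by its single neighbour
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])
Y = lle_embedding(W, d=1)
print(Y.shape)  # -> (1, 4)
```

The near-zero eigenvalue corresponds to the all-ones eigenvector, i.e. the translational degree of freedom removed by the zero-mean constraint.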
As the LLE procedure shows, the algorithm must first find the k neighbor points of each data point. The neighbor number k determines how many neighbors reconstruct each point; different neighbor counts lead to different reconstruction weights, and hence to different low-dimensional manifold embeddings, i.e., to different extracted discriminative features. If k is too small, each data point cannot be well reconstructed, which degrades the dimensionality reduction result; if k is too large, over-fitting occurs, the points are not effectively reconstructed, and a good result again cannot be obtained. Under the same k, choosing the neighbors of a point in different ways yields different neighbors, hence different discriminative features and different final results. The number of neighbors and the way they are chosen therefore strongly affect the performance of LLE, and this is the difficulty of using the algorithm.
Spectrum angle principle
The spectral curve of each data point is usually regarded as a vector in the high-dimensional space, and the angle between two spectral-curve vectors is called the spectral angle. The spectral angle is a quantity based on statistical characteristics; it reflects the degree of similarity between the spectral curves of two data points and hence the similarity relationship between the data. It has been widely used in data classification with good effect. It is computed from the cosine of the angle between the two vectors X_i and X_j.
The cosine of the angle between two vectors is defined as:

$$\cos(\theta)=\frac{(X_i,X_j)}{\sqrt{(X_i,X_i)(X_j,X_j)}}\qquad(10)$$

where cos(θ) is the cosine of the angle between the two spectral-curve vectors and (·,·) denotes the inner product of two vectors.
From formula (10), the angle between the two spectral curves is:

$$\theta(X_i,X_j)=\cos^{-1}\left(\left|\frac{(X_i,X_j)}{\sqrt{(X_i,X_i)(X_j,X_j)}}\right|\right)\qquad(11)$$

where θ(X_i, X_j) denotes the angle between the two spectral-curve vectors X_i and X_j, cos⁻¹(·) denotes the arc cosine, and |·| denotes the absolute value.
The smaller the spectral angle between two spectral-curve vectors, the more similar the two curves, the stronger their correlation, and the more likely they belong to the same class; the larger the angle, the greater the difference between the curves, the weaker the correlation, and the less likely they belong to the same class. Spectral curves of the same class are very similar, their correlation is strong, and the angle between them is relatively small; curves of different classes differ more, their correlation is weak, and the angle is relatively large. Moreover, the spectral angle is independent of the modulus of the spectral vector and compares the shapes of the spectra, so it is little affected by illumination changes and by the "same object, different spectra" phenomenon. The angle between spectral curves can therefore discriminate spectral classes well; it has been widely used in spectral classification with good results. However, computing the spectral angle over the global data is expensive, and between two strongly correlated land-cover classes the spectral angle is also relatively small, making class discrimination difficult. Because the global spectral angle sometimes cannot achieve the desired performance, the present application proposes the local spectral angle algorithm for this problem.
Neighbor measurement by the local spectral angle (LSA)
The LLE algorithm must first find the neighbors of each data point, but neighbors obtained by different methods have a very large influence on the final classification effect. The classical method measures neighbors by Euclidean distance, which only represents the straight-line distance between two points; in a high-dimensional space this does not necessarily reflect the true geometric relationship between the data. Points with a small Euclidean distance may be far apart in the spatial geometric distribution, so geometrically distant points may be taken as neighbors; at the same time, data points of different classes may be geometrically close in the high-dimensional space and thus easily taken as neighbors, so the extracted discriminative features cannot well reflect the intrinsic characteristics of each class.
Because spectra of the same class are highly similar and strongly correlated, the spectral angle between them is small. We therefore first obtain a large-scale neighborhood by Euclidean distance and then refine it by the spectral angle, which excludes most of the different-class points within that range. The resulting neighbors are more likely to belong to the same class, the neighbor selection is more stable, and it is less affected by noise. At the same time, the local spectral angle removes the influence of the global data on the spectral angle and requires far less computation.
Fig. 1 shows the neighbor relationships among the data points. Euclidean distance is applied first: the neighborhood of point A consists of the points inside the dashed circle, and at this stage the different-class point C is also judged a neighbor of A. The angle between vectors is then applied: among points A, B and C, although the Euclidean distance AC is smaller than AB, the angle AOC (O is the origin) is larger than the angle AOB, so the different-class neighbor is excluded and the neighbor of A becomes B. If only Euclidean distance were used, the neighbor of A would be C. Measuring neighbors by the local spectral angle thus reduces the chance of taking different-class points as neighbors and increases the probability that neighbors belong to the same class, so the neighbors obtained extract discriminative features more effectively.
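The two-stage neighbor search sketched in Fig. 1 can be expressed in a few lines. This Python sketch is illustrative only; `X` is assumed to hold one spectral vector per row, and `m`/`k` follow the text:

```python
import numpy as np

def lsa_neighbors(X, i, m, k):
    """Coarse-to-fine neighbor search: m Euclidean neighbors of X[i],
    refined to the k neighbors with the smallest spectral angle."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf                              # exclude the point itself
    coarse = np.argsort(d)[:m]                 # large-scale neighborhood N_O
    Z = X[coarse]
    cos_val = np.abs(Z @ X[i]) / (np.linalg.norm(Z, axis=1) * np.linalg.norm(X[i]))
    angles = np.arccos(np.clip(cos_val, -1.0, 1.0))
    return coarse[np.argsort(angles)[:k]]      # accurate neighborhood N_A

# Fig. 1 in miniature: C is Euclidean-closer to A, but B points in A's direction.
X = np.array([[1.0, 1.0],    # A
              [2.0, 2.0],    # B (same direction as A)
              [1.5, 0.5]])   # C (closer in Euclidean distance)
print(lsa_neighbors(X, 0, m=2, k=1))           # the angle step picks B (index 1), not C
```

With Euclidean distance alone, A's single nearest neighbor would be C; the angle refinement recovers B.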
Fig. 2 shows the spectral curves of five data points a, b, c, d and e chosen from the KSC hyperspectral remote sensing data set, where a, b and c belong to one class and d and e to another; for clarity, only bands 40~90 are plotted. Suppose the two nearest neighbors of data point a are required. With the traditional neighbor measure, the Euclidean distances between a and the remaining data points are:
D(a,b)=105.9953, D(a,c)=127.9844, D(a,d)=118.0466, D(a,e)=136.0074 (12)
From the Euclidean distances in formula (12), a large-scale neighborhood of data point a is obtained by choosing the three points with the smallest distances, b, c and d, as neighbors. The spectral angles between a and these three points are then:
θ(a,b)=0.0449,θ(a,c)=0.0457,θ(a,d)=0.0487 (13)
From the spectral angles in formula (13), the accurate neighbors of data point a are the two points with the smallest angles, so the neighbors of a are data points b and c (b and c belong to the same class as a). If the traditional Euclidean distance were used directly, the neighbors of a would be b and d (b and d are not of the same class). The neighbors obtained through the local spectral angle are therefore more likely to be same-class points, which benefits the subsequent feature extraction.
It follows that first measuring neighbors with the traditional Euclidean distance to obtain a large-scale neighborhood, and then applying the spectral angle within that neighborhood to obtain accurate neighbors, reduces the chance of taking different-class points as neighbors and increases the probability that neighbors are same-class points. Neighbor selection becomes more accurate, the intrinsic discriminative features of same-class data are extracted more effectively, and the final classification improves. Combining this with the LLE algorithm, the present invention proposes the LSA-LLE algorithm and the LSANN classification algorithm.
LSA-LLE algorithm principle
The key difference between LSA-LLE and the traditional LLE algorithm lies in how neighbors are measured. Traditional LLE measures neighbors by Euclidean distance; LSA-LLE first obtains m large-scale neighbors by the traditional Euclidean distance and then, within those m neighbors, obtains k accurate neighbors by the spectral angle, where k < m. Each data point is then locally reconstructed from its k neighbors so that the reconstruction error is minimized, and the local reconstruction information is kept unchanged in the low-dimensional space while the error is again minimized, yielding the low-dimensional embedded manifold structure. The algorithm proceeds as follows:
1. In the data set, compute the Euclidean distance between data point x_i and every other data point; the m points with the smallest Euclidean distances form the large-scale neighborhood N_O.
2. Within the large-scale neighborhood N_O, compute the spectral angle between x_i and each data point; the k points with the smallest spectral angles form the accurate neighborhood N_A.
3. Using the accurate neighborhood N_A, locally reconstruct each data point so that the reconstruction error is minimized, giving the local reconstruction weight matrix W. The objective function is:
$$\min \varepsilon(W) = \sum_{i=1}^{N} \Big\| x_i - \sum_{j=1}^{N} w_{ij} x_j \Big\|^2 \qquad (14)$$
where w_ij is the weight between x_i and x_j; if x_j is a neighbor of x_i then w_ij ≠ 0, otherwise w_ij = 0.
Under the constraint that the weights of each point sum to one, the local reconstruction weights W are obtained from formula (14) by the Lagrange multiplier method.
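For a single point, the Lagrange solution of formula (14) under the sum-to-one weight constraint has the standard LLE closed form through the local Gram matrix. The following Python sketch is illustrative; the `reg` term is an assumption added to handle numerically singular Gram matrices:

```python
import numpy as np

def reconstruction_weights(X, i, neighbors, reg=1e-3):
    """min ||x_i - sum_j w_j x_j||^2  s.t.  sum_j w_j = 1,
    solved via the local Gram matrix (Lagrange multiplier closed form)."""
    Z = X[neighbors] - X[i]               # neighbors shifted so x_i is the origin
    G = Z @ Z.T                           # local Gram matrix
    G = G + reg * np.eye(len(neighbors))  # regularize: G may be singular
    w = np.linalg.solve(G, np.ones(len(neighbors)))
    return w / w.sum()                    # enforce the sum-to-one constraint

# x_3 is the midpoint of x_1 and x_2, so the weights should be about [0.5, 0.5].
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
print(reconstruction_weights(X, 3, [1, 2]))
```

The closed form follows from eliminating the Lagrange multiplier of the sum-to-one constraint: solve G w = 1 and normalize.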
4. In the low-dimensional space, keep the local reconstruction information unchanged and minimize the reconstruction error to obtain the low-dimensional embedding result Y. The objective function is:
$$\min \varepsilon(Y) = \sum_{i=1}^{N} \Big\| y_i - \sum_{j=1}^{N} w_{ij} y_j \Big\|^2 = \sum_{i=1}^{N} \| Y I_i - Y W_i \|^2 = \operatorname{tr}(Y M Y^{T}) \qquad (15)$$
where I_i is the i-th column of the identity matrix, W_i is the local reconstruction weight vector of data point i, W = [W_1, W_2, ..., W_N]^T, and M = (I − W)(I − W)^T is a symmetric positive semidefinite matrix.
To obtain a unique solution, two constraints are added: (a) the centroid of the projected data is the origin, i.e. $\sum_{i=1}^{N} y_i = 0$; (b) the low-dimensional coordinates after dimensionality reduction are normalized, i.e. $\frac{1}{N} Y Y^{T} = I$, where I is the identity matrix.
Applying the Lagrange multiplier method to formula (15) under these constraints yields:
$$M Y^{T} = \lambda Y^{T} \qquad (16)$$
To reduce to a d-dimensional space while guaranteeing the minimum reconstruction error, the eigenvalues of M are computed, the near-zero eigenvalue is discarded, the d smallest remaining eigenvalues are taken, and the matrix formed by the corresponding eigenvectors is the low-dimensional embedding result Y.
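Since M is symmetric and positive semidefinite, formula (16) is an ordinary symmetric eigenproblem. The sketch below is illustrative Python; it stores each point's weights in a row of `W`, so it forms M as (I − W)^T(I − W), which matches the document's M up to the orientation convention of W:

```python
import numpy as np

def lle_embedding(W, d):
    """Low-dimensional embedding from the weight matrix: eigenvectors of
    M = (I-W)^T (I-W) for the d smallest non-trivial eigenvalues."""
    N = W.shape[0]
    IW = np.eye(N) - W
    M = IW.T @ IW                       # symmetric, positive semidefinite
    vals, vecs = np.linalg.eigh(M)      # eigenvalues in ascending order
    # vals[0] is ~0 (constant eigenvector, since each row of W sums to 1): discard it
    return vecs[:, 1:d + 1].T           # d x N embedding Y

# Toy weights: each point reconstructed from its two ring neighbors.
N = 6
W = np.zeros((N, N))
for i in range(N):
    W[i, (i - 1) % N] = W[i, (i + 1) % N] = 0.5
Y = lle_embedding(W, 2)
print(Y.shape)                          # (2, 6)
```

Because the weight rows sum to one, (I − W) annihilates the constant vector, which is exactly the near-zero eigenvalue that the method casts out.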
LSANN classification principle
The traditional nearest-neighbor classifier assigns a new sample the class of the known sample at the minimum Euclidean distance. In a high-dimensional space, however, the Euclidean distance between points of different classes may be smaller than that between points of the same class, so the nearest-neighbor classifier may misjudge the class of some data points. The present invention therefore proposes the local spectral angle nearest neighbor (LSANN) classifier, which increases the probability of correctly judging the class of unknown data, classifies more stably, and clearly improves the ground-object classification result.
The main procedure of the LSANN classifier is: first obtain the m neighbors N_O of data point y_i by Euclidean distance, as in formula (17); then compute the spectral angle between y_i and each data point in N_O; finally take the class of the data point with the minimum spectral angle as the class of y_i, as in formula (18).
$$N_O = \Big\{ \min_{j=1}^{n} \big( D(y_i, y_j^{C_j}) \big)_m \Big\} \qquad (17)$$
where min(·)_m denotes taking the m smallest values, D(·) denotes the Euclidean distance, y_i is a sample of unknown class, and y_j^{C_j} is a known sample belonging to class C_j.
$$l_i = \min_{j=1}^{m} \Big( \theta\big(y_i, y_j^{C_j} \mid N_O\big) \Big) \qquad (18)$$
where l_i is the class assigned to the unknown sample y_i, and y_j^{C_j} denotes the class-C_j samples within the neighborhood N_O.
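Formulas (17) and (18) translate directly into a short routine. This Python sketch is illustrative (array names are assumptions); the toy data show a case where the plain nearest neighbor would pick the wrong class but the minimum-angle neighbor picks the right one:

```python
import numpy as np

def lsann_classify(Y_train, labels, y, m):
    """LSANN: m Euclidean neighbors first, then the label of the
    neighbor with the minimum spectral angle."""
    d = np.linalg.norm(Y_train - y, axis=1)
    coarse = np.argsort(d)[:m]                 # neighborhood N_O
    Z = Y_train[coarse]
    cos_val = np.abs(Z @ y) / (np.linalg.norm(Z, axis=1) * np.linalg.norm(y))
    angles = np.arccos(np.clip(cos_val, -1.0, 1.0))
    return labels[coarse[np.argmin(angles)]]

# The Euclidean-nearest point belongs to class 1, but the minimum-angle
# neighbor belongs to class 0, so LSANN assigns class 0.
Y_train = np.array([[1.0, 1.0], [2.0, 2.0], [3.5, 2.5]])
labels = np.array([0, 0, 1])
print(lsann_classify(Y_train, labels, np.array([3.0, 3.0]), m=2))
```

A plain nearest-neighbor classifier on the same data would return class 1, which is the misjudgment the LSANN design targets.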
From the above, the LSA-LLE algorithm of the present invention first obtains large-scale neighbors with the traditional Euclidean distance, then obtains accurate neighbors with the spectral angle, locally reconstructs each point from those neighbors with minimum reconstruction error, keeps the local reconstruction mode unchanged in the low-dimensional space, and again minimizes the reconstruction error, thereby extracting the intrinsic discriminative features of the high-dimensional data. The LSANN classifier first obtains the neighbors of a new sample by Euclidean distance, then computes the spectral angles between the new sample and those neighbors, and assigns the new sample to the class with the minimum spectral angle. The flow chart of the hyperspectral remote sensing image classification method of the present invention is Fig. 3, and its concrete steps are:
1) Randomly choose a number of samples with known class information from the data set as training samples, then randomly choose a number of samples as test samples;
2) Measure neighbors by the traditional Euclidean distance: the m training sample points with the smallest Euclidean distance to each training sample point form its large-scale neighborhood N_O;
3) Within the large-scale neighborhood N_O, use formula (19) to compute the spectral angle between each training sample point and the remaining training sample points; the k points with the smallest spectral angles form the accurate neighborhood N_A, where k < m;
$$\theta(X_i, X_j) = \cos^{-1}\left( \left| \frac{(X_i, X_j)}{\sqrt{(X_i, X_i)(X_j, X_j)}} \right| \right) \qquad (19)$$
where θ(X_i, X_j) is the angle between the two spectral-curve vectors X_i and X_j, cos⁻¹(·) is the arccosine operation, and |·| is the absolute value operation;
4) Using the accurate neighborhood N_A, locally reconstruct each training sample data point with formula (20) so that the reconstruction error is minimized, giving the local reconstruction weight matrix W:

$$\min \varepsilon(W) = \sum_{i=1}^{N} \Big\| x_i - \sum_{j=1}^{N} w_{ij} x_j \Big\|^2 \qquad (20)$$

where w_ij is the weight between x_i and x_j; if x_j is a neighbor of x_i then w_ij ≠ 0, otherwise w_ij = 0;
5) In the low-dimensional space, keep the local neighbor relations and reconstruction weights unchanged; the low-dimensional embedding result Y of the training sample points is obtained with formula (21):

$$\min \varepsilon(Y) = \sum_{i=1}^{N} \Big\| y_i - \sum_{j=1}^{N} w_{ij} y_j \Big\|^2 = \sum_{i=1}^{N} \| Y I_i - Y W_i \|^2 = \operatorname{tr}(Y M Y^{T}) \qquad (21)$$

where I_i is the i-th column of the identity matrix, W_i is the local reconstruction weight vector of data point i, W = [W_1, W_2, ..., W_N]^T, and M = (I − W)(I − W)^T is a symmetric positive semidefinite matrix;
6) Add a test sample point to the training samples. First obtain the m large-scale neighbors of the test sample point by Euclidean distance, then compute the spectral angle between the test sample point and each of these m large-scale neighbors; the k neighbors with the smallest spectral angles, where k < m, are the accurate neighbors of the test sample point. Locally reconstruct the test sample from the k accurate neighbors while keeping the reconstruction error minimal to obtain the reconstruction weights, and finally express the low-dimensional embedding of the test sample linearly through the low-dimensional embeddings of the k accurate neighbors and the reconstruction weights, thereby obtaining the low-dimensional embedding result of the test sample;
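Step 6), the out-of-sample extension, can be sketched as follows (illustrative Python; names are assumptions). As a sanity check, when the "embedding" is the identity map and the test point lies in the affine hull of its accurate neighbors, the embedded point reproduces the original coordinates:

```python
import numpy as np

def embed_test_point(X_train, Y_train, x, m, k, reg=1e-3):
    """Embed a test point: LSA neighbors, reconstruction weights, then the
    same weights applied to the neighbors' low-dimensional embeddings."""
    d = np.linalg.norm(X_train - x, axis=1)
    coarse = np.argsort(d)[:m]                    # m Euclidean neighbors
    Z = X_train[coarse]
    cos_val = np.abs(Z @ x) / (np.linalg.norm(Z, axis=1) * np.linalg.norm(x))
    order = np.argsort(np.arccos(np.clip(cos_val, -1.0, 1.0)))
    nbrs = coarse[order[:k]]                      # k accurate neighbors
    C = X_train[nbrs] - x
    G = C @ C.T + reg * np.eye(k)                 # regularized local Gram matrix
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()                                  # sum-to-one reconstruction weights
    return w @ Y_train[nbrs]                      # linear expression in the embedding

# Third point is far away and at a large angle, so the two accurate
# neighbors are the first two points, whose midpoint is the test point.
X_train = np.array([[2.0, 0.0], [0.0, 2.0], [10.0, -2.0]])
Y_train = X_train.copy()                          # identity "embedding" for the check
print(embed_test_point(X_train, Y_train, np.array([1.0, 1.0]), m=3, k=2))
```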
7) Use the classifier to classify the dimensionality-reduced test sample data according to the dimensionality-reduced training samples and their class information, which gives the class information of the test samples. The classifier of step 7) works as follows: first obtain the m neighbors N_O of the dimensionality-reduced data point y_i by Euclidean distance, then compute the spectral angle between y_i and each data point in N_O, and finally take the class of the data point with the minimum spectral angle as the class of y_i. The procedure is:
1. Obtain the m neighbors N_O of the dimensionality-reduced test sample point y_i among the dimensionality-reduced training samples by Euclidean distance, as shown in formula (22):
$$N_O = \Big\{ \min_{j=1}^{n} \big( D(y_i, y_j^{C_j}) \big)_m \Big\} \qquad (22)$$
where min(·)_m denotes taking the m smallest values, D(·) denotes the Euclidean distance, y_i is a sample of unknown class, and y_j^{C_j} is a known sample belonging to class C_j;
2. Within the neighborhood N_O, obtain the class of y_i according to the spectral angle, as shown in formula (23):
$$l_i = \min_{j=1}^{m} \Big( \theta\big(y_i, y_j^{C_j} \mid N_O\big) \Big) \qquad (23)$$
where l_i is the class assigned to the unknown sample y_i, and y_j^{C_j} denotes the class-C_j samples within the neighborhood N_O.
In the above, the known data points are the training sample data and the unknown data are the test samples.
To verify the ground-object classification performance of the LSA-LLE algorithm and the LSANN classification algorithm on hyperspectral remote sensing images, the Indian Pine and KSC hyperspectral remote sensing data sets were chosen for classification experiments. On both data sets, the proposed LSA-LLE+LSANN and LLE+LSANN are compared with LSA-LLE, LLE, LPP and NPE, where LSA-LLE, LLE, LPP and NPE use the traditional nearest-neighbor classifier.
The experiments are set up as follows. To study the influence of the neighbor number k on the results, k is chosen as 3, 4, 5, 6, 7 and 8. To compare local-spectral-angle neighbor measurement with Euclidean-distance neighbor measurement, LPP, NPE, LLE and LSA-LLE are compared at each value of k. To study the classification performance of the LSANN classifier, LLE, LLE+LSANN, LSA-LLE and LSA-LLE+LSANN are compared at each value of k. Finally, all algorithms are compared in a ground-object classification experiment.
Indian Pine data set
The Indian Pine hyperspectral remote sensing image was acquired by NASA's AVIRIS hyperspectral sensor on June 12, 1992. The image consists of 145 × 145 pixels and 220 bands covering the wavelength range 400 nm~2450 nm, and covers about 100 square kilometers of ground objects in northwestern Indiana, USA. After removing the 20 bands most affected by atmospheric water absorption and noise (bands 104~108, 150~163 and 220), the remaining 200 bands are used for the experimental study. Fig. 4 shows the false-color image of the Indian Pine hyperspectral image and its true ground-object distribution. Six common ground-object classes were chosen for the experiments on the proposed LSA-LLE+LSANN algorithm; the class information is shown in Table 1.
Table 1: Class information of the Indian Pine data set
The experiment parameters are set as follows. For the classification-accuracy experiments, each run randomly chooses 40 data points per class as training samples and 100 data points as test samples; each chosen set is tested on every algorithm, the experiment is repeated 10 times, and the mean of the 10 runs is taken as the final result. When classifying all the selected ground objects, 40 points per class are randomly chosen as training samples, the accurate neighbor number k is chosen as 6, and the embedding dimension is chosen as 40, because at an embedding dimension of 40 the overall classification accuracy of each algorithm has essentially leveled off. For LSA-LLE the large-scale neighbor number m is 10 larger than the accurate neighbor number k, and for LSANN the large-scale neighbor number m is 5.
Table 2: Highest overall classification accuracy (mean ± variance (%) (dimension)) of each algorithm on the Indian Pine data set
Fig. 5 shows how the overall classification accuracy of each algorithm varies with k and with the embedding dimension, and Table 2 gives the maximum overall classification accuracy and its variance for each algorithm at different values of k. From Fig. 5 and Table 2, the following conclusions can be drawn:
1. As k increases, the overall classification accuracy and the highest overall classification accuracy of each algorithm (with a few exceptions) increase steadily, showing that a larger number of neighbors makes the reconstruction more accurate, the error smaller, and the overall classification accuracy better.
2. The overall classification accuracy of each algorithm increases with the embedding dimension and finally levels off: as the embedding dimension grows, the embedded data carry richer information and the accuracy rises, and once the information is rich enough the accuracy stabilizes.
3. LSA-LLE outperforms LLE in both overall classification accuracy and highest overall classification accuracy, showing that local-spectral-angle neighbor measurement works better than Euclidean-distance neighbor measurement: neighbors chosen by the local spectral angle are more likely to be same-class points, so the intrinsic discriminative features among same-class points are extracted better.
4. LLE+LSANN outperforms LLE, and LSA-LLE+LSANN outperforms LSA-LLE, in overall and highest overall classification accuracy, and both vary more smoothly with the dimension, showing that the LSANN classifier is more accurate and more stable than the nearest-neighbor classifier.
5. LSA-LLE+LSANN, which combines LSA-LLE and LSANN, achieves better classification accuracy than all the other algorithms mentioned here, with more stable results and less sensitivity to noise. As k varies from 3 to 8, its highest overall classification accuracy exceeds that of LSA-LLE by 2.70%, 3.12%, 1.60%, 1.83%, 2.38% and 2.23%, that of LLE+LSANN by 3.23%, 1.45%, 0.82%, 1.70%, 1.08% and 1.12%, and that of LLE by 4.45%, 4.73%, 4.83%, 3.92%, 3.45% and 3.28%, respectively.
Fig. 6 shows the per-class classification results of the different algorithms. From Fig. 6 the following conclusions can be drawn:
1. The proposed LSA-LLE algorithm classifies more effectively than LLE, indicating that local-spectral-angle neighbor measurement extracts the discriminative features of each data point more effectively.
2. LLE+LSANN classifies better than LLE, indicating that the LSANN classifier outperforms the nearest-neighbor classifier.
3. LSA-LLE+LSANN, which combines LSA-LLE and LSANN, classifies better than all the other algorithms mentioned here, indicating that it extracts discriminative features more effectively and achieves more effective ground-object classification.
Table 3: Per-class ground-object classification accuracy (%) of different methods on the Indian Pine data set
Table 3 gives the per-class classification accuracies of the different methods. From Table 3 the following conclusions can be drawn:
1. LSA-LLE classifies all classes more steadily than LLE, which performs very well on some classes and poorly on others; LSA-LLE improves classification stability, with the largest gains on the two classes with the most samples, C1 and C5, while its accuracy on the remaining classes differs little from LLE. Local-spectral-angle neighbor measurement therefore extracts features better than direct Euclidean-distance neighbor measurement.
2. LLE+LSANN classifies essentially every class better than LLE, and its per-class accuracies are steadier, so LSANN-based classification is better overall than nearest-neighbor classification.
3. LSA-LLE+LSANN shows the clearest improvement over the remaining algorithms on the two largest classes, C1 and C5; its accuracy on the remaining classes is comparable but steadier, and its overall classification result is the best of all the algorithms mentioned here, indicating that combining LSA-LLE and LSANN benefits ground-object classification and gives a better overall result.
Table 4: Classification results of different methods on the Indian Pine data set
Table 4 gives the overall accuracy and the Kappa coefficient of each algorithm. From Table 4 the following conclusions can be drawn:
1. The overall accuracy and Kappa coefficient of LSA-LLE and of LLE+LSANN are both better than those of LLE, showing that the proposed LSA-LLE and LSANN are more effective in feature extraction and classification, respectively.
2. The overall accuracy and Kappa coefficient of LSA-LLE+LSANN are better than those of all the other algorithms; compared with LLE, LSA-LLE and LLE+LSANN, the overall accuracy improves by 4.95%, 1.98% and 2.24%, and the Kappa coefficient improves by 0.060, 0.025 and 0.028, respectively.
KSC data set
The KSC hyperspectral remote sensing image was acquired on March 23, 1996 by NASA's AVIRIS imaging spectrometer flown at an altitude of about 20 km. The image is 614 × 512 pixels, its spatial resolution reaches 18 m, and its spectral range of 400~2500 nm comprises 224 bands in total. The image covers ground objects near the Kennedy Space Center in Florida, USA. After rejecting the bands affected by atmospheric absorption and noise, the remaining 176 bands are used for the experimental study. Fig. 7 shows the false-color image of the KSC hyperspectral image and its true ground-object distribution. Six common ground-object classes were chosen for the experiments on the proposed LSA-LLE+LSANN algorithm; the class information is shown in Table 5.
Table 5: Class information of the KSC data set
The parameter settings follow those of the Indian Pine experiments, except that the 40 randomly chosen training samples per class are replaced by 20; all other parameters are unchanged.
Table 6: Highest overall classification accuracy (mean ± variance (%) (dimension)) of each algorithm on the KSC data set
Fig. 8 shows how the overall classification accuracy of each algorithm varies with k and with the embedding dimension, and Table 6 gives the maximum overall classification accuracy and its variance at different values of k. From Fig. 8 and Table 6 the following conclusions can be drawn:
1. As k increases, the overall and highest overall classification accuracies of the algorithms (with a few exceptions) increase steadily, and the overall accuracy rises with the embedding dimension before finally leveling off.
2. LSA-LLE and LLE+LSANN both outperform LLE in overall and highest overall classification accuracy.
3. The highest overall classification accuracy of LSA-LLE+LSANN is better than that of all the other algorithms mentioned here. As k varies from 3 to 8, it exceeds LSA-LLE by 3.98%, 4.80%, 3.33%, 3.65%, 3.05% and 4.97%, LLE+LSANN by 1.27%, 3.75%, 1.05%, 1.78%, 1.68% and 1.68%, and LLE by 6.82%, 6.10%, 6.75%, 5.55%, 5.40% and 6.47%, respectively.
Fig. 9 shows the per-class classification results of the different algorithms. From Fig. 9 it can be concluded that the proposed LSA-LLE classifies more effectively than LLE, LLE+LSANN classifies better than LLE, and LSA-LLE+LSANN, which combines LSA-LLE and LSANN, classifies better than all the other algorithms mentioned here.
Table 7: Per-class ground-object classification accuracy (%) of different methods on the KSC data set
Table 7 gives the per-class classification accuracies of the different methods. From Table 7 it can be concluded that LSA-LLE classifies essentially every class better and more stably than LLE; LLE+LSANN classifies every class better and more steadily than LLE; and LSA-LLE+LSANN classifies essentially every class better than the remaining algorithms, with steadier accuracy and the best overall classification result.
Table 8: Classification results of different methods on the KSC data set
Table 8 gives the overall accuracy and the Kappa coefficient of each algorithm. From Table 8 it can be concluded that the overall accuracy and Kappa coefficient of LSA-LLE and of LLE+LSANN are both better than those of LLE, and those of LSA-LLE+LSANN are better than those of all the other algorithms mentioned: compared with LLE, LSA-LLE and LLE+LSANN, the overall accuracy improves by 11.87%, 4.69% and 4.98%, and the Kappa coefficient improves by 0.142, 0.059 and 0.062, respectively.
The above experimental results show that the proposed local spectral angle measures neighbors better than the Euclidean distance, increasing the probability that neighbors belong to the same class and extracting the discriminative features of same-class ground objects more effectively; that the proposed LSANN classifier classifies better and more stably than the nearest-neighbor classifier; and that LSA-LLE+LSANN, which combines LSA-LLE and LSANN, is more stable, more noise-robust and more accurate than the other algorithms. The method thus effectively improves the ground-object classification of hyperspectral remote sensing images: LSA-LLE+LSANN improves the overall accuracy over LLE, LLE+LSANN and LSA-LLE by 1.98%~11.87% and the Kappa coefficient by 0.025~0.142.
To address the problems of Euclidean-distance neighbor measurement, the present invention proposes the LSA-LLE algorithm and the LSANN classifier. LSA-LLE first obtains large-scale neighbors with the traditional Euclidean distance, then obtains accurate neighbors with the spectral angle, locally reconstructs each point from those neighbors with minimum reconstruction error, keeps the local reconstruction mode unchanged in the low-dimensional space, and again minimizes the reconstruction error, thereby extracting the intrinsic discriminative features of the high-dimensional data. The LSANN classifier first obtains the neighbors of a new sample by Euclidean distance, then computes the spectral angles between the new sample and those neighbors, and assigns the new sample to the class with the minimum spectral angle. The analysis and experimental results show that the proposed local spectral angle measures neighbors better than the Euclidean distance, increasing the probability that neighbors are same-class points and extracting the discriminative features of same-class ground objects more effectively; that the LSANN classifier judges classes more accurately and more stably than the nearest-neighbor classifier; and that LSA-LLE+LSANN, combining LSA-LLE and LSANN, extracts discriminative features more effectively, classifies better, and is more stable and noise-robust than the other algorithms mentioned. The ground-object classification experiments on the Indian Pine and KSC hyperspectral data sets show that the method of the invention effectively improves hyperspectral ground-object classification: LSA-LLE+LSANN improves the overall accuracy over LLE, LLE+LSANN and LSA-LLE by 1.98%~11.87% and the Kappa coefficient by 0.025~0.142.
The above embodiment of the present invention is only an illustrative example and does not limit the embodiments of the present invention. Those of ordinary skill in the art can make other variations and changes of different forms on the basis of the above description; an exhaustive list of all embodiments cannot be given here. Any obvious variation or change derived from the technical scheme of the present invention remains within the protection scope of the present invention.

Claims (2)

1. A hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles, characterized in that its steps are:
1) Randomly choose a number of samples with known class information from the data set as training samples, then randomly choose a number of samples as test samples;
2) Measure large-scale neighbors by Euclidean distance: the m training sample points with the smallest Euclidean distance to each training sample point form its large-scale neighborhood N_O;
3) Within the large-scale neighborhood N_O, use formula (1) to compute the spectral angle between each training sample point and the remaining training sample points; the k points with the smallest spectral angles form the accurate neighborhood N_A, where k < m;
$$\theta(X_i, X_j) = \cos^{-1}\left( \left| \frac{(X_i, X_j)}{\sqrt{(X_i, X_i)(X_j, X_j)}} \right| \right) \qquad (1)$$
where θ(X_i, X_j) denotes the angle between the two spectral curve vectors X_i and X_j, cos⁻¹(·) denotes the arc-cosine operation, and |·| denotes the absolute value;
4) according to the accurate neighbor set N_A, use formula (2) to perform local reconstruction of each training sample point so that the reconstruction error is minimized, obtaining the local reconstruction weight matrix W;
$$\min \varepsilon(W) = \sum_{i=1}^{N} \left\| x_i - \sum_{j=1}^{N} w_{ij} x_j \right\|^2 \qquad (2)$$
where w_ij is the weight between x_i and x_j, subject to the constraint $\sum_{j=1}^{N} w_{ij} = 1$; if x_j is a neighbor point of x_i then w_ij ≠ 0, otherwise w_ij = 0;
5) in the low-dimensional space, keep the local neighbor relations unchanged and the reconstruction weights unchanged; the low-dimensional embedding result Y of the training sample points is obtained with formula (3);
$$\min \varepsilon(Y) = \sum_{i=1}^{N} \left\| y_i - \sum_{j=1}^{N} w_{ij} y_j \right\|^2 = \sum_{i=1}^{N} \left\| Y I_i - Y W_i \right\|^2 = \mathrm{tr}\left( Y M Y^{T} \right) \qquad (3)$$
where I_i is the i-th column of the identity matrix, W_i is the local reconstruction weight vector of data point i, $W = [W_1, W_2, \ldots, W_N]^T$, and $M = (I - W)(I - W)^T$ is a symmetric positive semidefinite matrix; the constraint condition is $\frac{1}{N} Y Y^{T} = I$, where I is the identity matrix;
6) for a test sample point added to the training samples: first obtain the m large-scale neighbors of the test sample point by Euclidean distance, then calculate the spectral angles between the test sample point and these m large-scale neighbors; the k neighbors with the smallest spectral angles are the accurate neighbors of the test sample point, where k < m; perform local reconstruction of the test sample with the k accurate neighbors while keeping the reconstruction error minimal, obtaining its reconstruction weights; finally, linearly express the low-dimensional embedding of the test sample with the low-dimensional embeddings of its k accurate neighbors and the reconstruction weights, thereby obtaining the low-dimensional embedding result of the test sample;
7) classify the dimension-reduced test sample data with a classifier, according to the dimension-reduced training samples and their class information, to obtain the class information of the test samples.
2. The hyperspectral remote sensing image classification method based on measuring manifold neighbors by local spectral angles according to claim 1, characterized in that the classifier of step 7) classifies as follows: first obtain by Euclidean distance the m neighbors N_O of the dimension-reduced data point y_i; then calculate the spectral angle between y_i and each data point in N_O; finally take the class of the data point with the smallest spectral angle to y_i as the class of y_i; the process is as follows:
① obtain by Euclidean distance the m neighbors N_O of the dimension-reduced test sample point y_i among all dimension-reduced training sample data, as shown in formula (4):
$$N_O = \left\{ \min_{j=1,\ldots,n} \left( D\left( y_i,\, y_j^{C_j} \right) \right)_m \right\} \qquad (4)$$
where $\min(\cdot)_m$ denotes taking the m smallest values, $D(\cdot)$ denotes computing the Euclidean distance, $y_i$ is the data of unknown class, and $y_j^{C_j}$ is the data of known class $C_j$;
② within the neighbor set N_O, obtain the class of y_i according to the spectral angle, as shown in formula (5):
$$l_i = \min_{j=1,\ldots,m} \left( \theta\left( y_i,\, y_j^{C_j} \,\middle|\, N_O \right) \right) \qquad (5)$$
where $l_i$ is the class obtained for the unknown data $y_i$, and $y_j^{C_j}$ is the class-$C_j$ data within the neighbor set $N_O$.
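For illustration only, steps 4) and 5) of claim 1 and the LSANN classifier of claim 2 can be sketched along the following lines. The sketch is not part of the claims; its function names and regularization term are the sketch's own choices, and it follows the standard LLE solution in which the embedding is read off the bottom eigenvectors of M:

```python
import numpy as np

def spectral_angle(x, y):
    """Formula (1): arccos(|<x,y>| / sqrt(<x,x><y,y>))."""
    num = abs(np.dot(x, y))
    return np.arccos(np.clip(num / np.sqrt(np.dot(x, x) * np.dot(y, y)), 0.0, 1.0))

def reconstruction_weights(X, neighbors):
    """Formula (2): least-squares weights reconstructing each x_i from its
    accurate neighbors, normalized so that sum_j w_ij = 1."""
    N = len(X)
    W = np.zeros((N, N))
    for i, idx in enumerate(neighbors):
        Z = X[idx] - X[i]                            # neighbors centered on x_i
        G = Z @ Z.T                                  # local Gram matrix
        G += 1e-6 * np.trace(G) * np.eye(len(idx))   # regularization (sketch's choice)
        w = np.linalg.solve(G, np.ones(len(idx)))
        W[i, idx] = w / w.sum()
    return W

def low_dim_embedding(W, d):
    """Formula (3): minimize the embedding cost ||(I-W)Y||_F^2; the result is
    given by the eigenvectors of the d smallest nonzero eigenvalues of M."""
    I = np.eye(W.shape[0])
    M = (I - W).T @ (I - W)          # standard LLE form of the cost matrix
    _, vecs = np.linalg.eigh(M)      # eigenvalues in ascending order
    return vecs[:, 1:d + 1]          # drop the ~0 constant eigenvector

def lsann_classify(y, Y_train, labels, m):
    """Formulas (4)-(5): m Euclidean neighbors N_O of y, then the class of
    the neighbor with the smallest spectral angle."""
    order = np.argsort(np.linalg.norm(Y_train - y, axis=1))[:m]   # N_O
    angles = [spectral_angle(y, Y_train[j]) for j in order]
    return labels[order[int(np.argmin(angles))]]
```

A test sample's embedding (step 6 of claim 1) would likewise be the weighted sum of its accurate neighbors' embeddings, with weights solved the same way as in `reconstruction_weights`.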
CN201410023922.4A 2014-01-17 2014-01-17 Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles Pending CN103729651A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410023922.4A CN103729651A (en) 2014-01-17 2014-01-17 Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles


Publications (1)

Publication Number Publication Date
CN103729651A true CN103729651A (en) 2014-04-16

Family

ID=50453715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410023922.4A Pending CN103729651A (en) 2014-01-17 2014-01-17 Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles

Country Status (1)

Country Link
CN (1) CN103729651A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104318537A (en) * 2014-09-30 2015-01-28 中国科学院深圳先进技术研究院 Method and system for detecting and removing raindrop in heavy rain scene video data
CN106886793A (en) * 2017-01-23 2017-06-23 西安电子科技大学 Hyperspectral image band selection method based on discriminant information and manifold information
CN108416355A (en) * 2018-03-09 2018-08-17 浙江大学 A kind of acquisition method of the industry spot creation data based on machine vision
CN109145945A (en) * 2018-07-12 2019-01-04 汕头大学 A kind of hyperspectral image classification method that non local weighting joint sparse indicates
CN110378272A (en) * 2019-07-12 2019-10-25 河海大学 Target in hyperspectral remotely sensed image feature extracting method based on partitioning of matrix Isomap algorithm
CN110490268A (en) * 2019-08-26 2019-11-22 山东浪潮人工智能研究院有限公司 A kind of feature matching method of the improvement nearest neighbor distance ratio based on cosine similarity
CN110619370A (en) * 2019-09-23 2019-12-27 云南电网有限责任公司电力科学研究院 Hyperspectral image super-pixel local linear embedding dimension reduction method
CN111178160A (en) * 2019-12-11 2020-05-19 广州地理研究所 Method and device for determining urban ground feature coverage information
CN112329654A (en) * 2020-11-10 2021-02-05 中国地震局地震预测研究所 Hyperspectral image data classification method and system based on multi-manifold learning algorithm
CN112766227A (en) * 2021-02-04 2021-05-07 中国地质调查局武汉地质调查中心 Hyperspectral remote sensing image classification method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299237A (en) * 2008-06-05 2008-11-05 北京航空航天大学 High spectroscopic data supervision classifying method based on information quantity dimensionality sequence
WO2010019515A2 (en) * 2008-08-10 2010-02-18 Board Of Regents, The University Of Texas System Digital light processing hyperspectral imaging apparatus
CN101770584A (en) * 2009-12-30 2010-07-07 重庆大学 Extraction method for identification characteristic of high spectrum remote sensing data
CN103065160A (en) * 2013-01-23 2013-04-24 西安电子科技大学 Hyperspectral image classification method based on local cooperative expression and neighbourhood information constraint


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Jiamin (刘嘉敏) et al.: "Locally linear embedding algorithm fusing an angle metric" (融合夹角度量的局部线性嵌入算法), Opto-Electronic Engineering (《光电工程》) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104318537A (en) * 2014-09-30 2015-01-28 中国科学院深圳先进技术研究院 Method and system for detecting and removing raindrop in heavy rain scene video data
CN106886793A (en) * 2017-01-23 2017-06-23 西安电子科技大学 Hyperspectral image band selection method based on discriminant information and manifold information
CN106886793B (en) * 2017-01-23 2020-02-07 西安电子科技大学 Hyperspectral image waveband selection method based on discrimination information and manifold information
CN108416355A (en) * 2018-03-09 2018-08-17 浙江大学 A kind of acquisition method of the industry spot creation data based on machine vision
CN109145945B (en) * 2018-07-12 2021-10-29 汕头大学 Hyperspectral image classification method based on non-local weighting and sparse representation
CN109145945A (en) * 2018-07-12 2019-01-04 汕头大学 A kind of hyperspectral image classification method that non local weighting joint sparse indicates
CN110378272A (en) * 2019-07-12 2019-10-25 河海大学 Target in hyperspectral remotely sensed image feature extracting method based on partitioning of matrix Isomap algorithm
CN110378272B (en) * 2019-07-12 2022-09-23 河海大学 Hyperspectral remote sensing image feature extraction method based on matrix blocking Isomap algorithm
CN110490268A (en) * 2019-08-26 2019-11-22 山东浪潮人工智能研究院有限公司 A kind of feature matching method of the improvement nearest neighbor distance ratio based on cosine similarity
CN110619370A (en) * 2019-09-23 2019-12-27 云南电网有限责任公司电力科学研究院 Hyperspectral image super-pixel local linear embedding dimension reduction method
CN111178160A (en) * 2019-12-11 2020-05-19 广州地理研究所 Method and device for determining urban ground feature coverage information
CN112329654A (en) * 2020-11-10 2021-02-05 中国地震局地震预测研究所 Hyperspectral image data classification method and system based on multi-manifold learning algorithm
CN112766227A (en) * 2021-02-04 2021-05-07 中国地质调查局武汉地质调查中心 Hyperspectral remote sensing image classification method, device, equipment and storage medium
CN112766227B (en) * 2021-02-04 2023-11-03 中国地质调查局武汉地质调查中心 Hyperspectral remote sensing image classification method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN103729651A (en) Hyperspectral remote sensing image classification method based on manifold neighbor measurement through local spectral angles
Qin et al. Individual tree segmentation and tree species classification in subtropical broadleaf forests using UAV-based LiDAR, hyperspectral, and ultrahigh-resolution RGB data
CN106295124B (en) The method of a variety of image detecting technique comprehensive analysis gene subgraph likelihood probability amounts
CN103077512B (en) Based on the feature extracting and matching method of the digital picture that major component is analysed
CN103971123B (en) Hyperspectral image classification method based on linear regression Fisher discrimination dictionary learning (LRFDDL)
Li et al. CBANet: An end-to-end cross-band 2-D attention network for hyperspectral change detection in remote sensing
CN107992891B (en) Multispectral remote sensing image change detection method based on spectral vector analysis
CN104376330B (en) Polarimetric SAR Image Ship Target Detection method based on super-pixel scattering mechanism
CN106503739A (en) The target in hyperspectral remotely sensed image svm classifier method and system of combined spectral and textural characteristics
Li et al. An automatic method for selecting the parameter of the RBF kernel function to support vector machines
CN103886336B (en) Polarized SAR image classifying method based on sparse automatic encoder
CN105930772A (en) City impervious surface extraction method based on fusion of SAR image and optical remote sensing image
CN103729652B (en) The Hyperspectral Remote Sensing Imagery Classification method embedded based on sparse holding manifold
CN106778680B (en) A kind of hyperspectral image band selection method and device based on critical bands extraction
CN102393913B (en) A kind of Weak target precise tracking method based on spectral fingerprint feature
CN102622607A (en) Remote sensing image classification method based on multi-feature fusion
CN103679192A (en) Image scene type discrimination method based on covariance features
Dong et al. A pixel cluster CNN and spectral-spatial fusion algorithm for hyperspectral image classification with small-size training samples
CN105894030B (en) High-resolution remote sensing image scene classification method based on layering multiple features fusion
CN113516052B (en) Imaging millimeter wave radar point cloud target classification method based on machine learning
Zhang et al. Mapping freshwater marsh species in the wetlands of Lake Okeechobee using very high-resolution aerial photography and lidar data
CN106485239A (en) One kind is using one-class support vector machines detection river mesh calibration method
CN110276746A (en) A kind of robustness method for detecting change of remote sensing image
Qi et al. Global–local 3-D convolutional transformer network for hyperspectral image classification
CN110363236A (en) The high spectrum image extreme learning machine clustering method of sky spectrum joint hypergraph insertion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20140416