CN104537377B - Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis - Google Patents

Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis

Info

Publication number
CN104537377B
Authority
CN
China
Prior art keywords
image data
matrix
formula
kernel matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410791475.7A
Other languages
Chinese (zh)
Other versions
CN104537377A (en)
Inventor
施俊
赵攀博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201410791475.7A priority Critical patent/CN104537377B/en
Publication of CN104537377A publication Critical patent/CN104537377A/en
Application granted granted Critical
Publication of CN104537377B publication Critical patent/CN104537377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/06 Topological mapping of higher dimensional structures onto lower dimensional surfaces

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image data dimensionality reduction method based on two-dimensional kernel entropy component analysis. Its steps are as follows: (1) read in the image data; (2) estimate the kernel function using Parzen windows; (3) build the kernel matrix of all image data, computed column by column; (4) compute the eigenvalues and eigenvectors of the correlation matrix of the image data; (5) compute the Renyi entropy of the image data; (6) map the eigenvectors of the correlation matrix of the image data with the two-dimensional kernel entropy component analysis method, realizing the dimensionality reduction of the image data. The method uses a two-dimensional analysis: the kernel mapping is applied directly to the rows or columns of the image, the entropy estimates of the kernel matrix of the image data are sorted, and the intrinsic dimension of the reduced image data is obtained while the spatial structure information of the image data is preserved. Because the kernel matrix of the image data is computed directly by row or by column, the two-dimensional image data need not be converted into one-dimensional vectors, which reduces the computational complexity when the kernel mapping is used to obtain the correlation matrix.

Description

Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis
Technical field
The present invention relates to an image data dimensionality reduction method based on two-dimensional kernel entropy component analysis (KECA). It belongs to the technical field of high-dimensional image data processing methods and their applications, and is suitable for theoretical research on, and applied techniques for, the dimensionality reduction of high-dimensional images.
Background technology
In applications such as face recognition, digit recognition and medical image recognition, the high dimensionality of image data usually requires dimensionality reduction as a first step. Image data consist of numerical grey-level values for each pixel; they represent the information of an image effectively and retain its spatial structure, but their dimension is high and their volume is large. How to extract the important information effectively, reduce the dimensionality of the image data and lower the computational complexity has therefore become a key step in image data processing.
Many dimensionality reduction methods have been proposed for image data, chiefly principal component analysis (PCA), kernel principal component analysis (KPCA) and kernel entropy component analysis (KECA), followed later by two-dimensional principal component analysis (2DPCA). PCA is a classical linear transformation method for image data, and KPCA is its nonlinear extension. Methods represented by PCA first compute the covariance matrix of the image data, obtain the eigenvalues and eigenvectors of this covariance matrix, build a coordinate system from the eigenvectors corresponding to the largest eigenvalues, and finally project the sample image data onto this coordinate system to obtain the reduced image data. Kernel entropy component analysis (Kernel Entropy Component Analysis, KECA) is a newer, information-theoretic data transformation method: it takes the coordinate axes of the quadratic Renyi entropy of the original image data as projection directions, which differs from traditional spectral transformation methods; the feature space selected by KECA gives the transformed image data a markedly angular structure, which is beneficial for further processing. These methods nevertheless have the following shortcomings. First, when PCA, KPCA or KECA is used for dimensionality reduction, the two-dimensional pixel matrix is first converted into a one-dimensional feature vector; this conversion not only fails to exploit the spatial structure information of the image data, but also increases the computational complexity when the covariance matrix or the kernel matrix of the image data is subsequently computed. Second, although dimensionality reduction based on 2DPCA does exploit the spatial structure information of the image data, such a linear processing method is still limited in practice.
In summary, the main problems of current image data dimensionality reduction methods are that the spatial structure information of the image data cannot be used effectively and that the computational complexity is high.
Summary of the invention
The purpose of the present invention is to address the shortcomings of conventional image data dimensionality reduction methods, namely that they cannot effectively use the spatial structure information of the image data and that their computational complexity is high, by proposing an image data dimensionality reduction method based on two-dimensional kernel entropy component analysis.
The technical solution of the present invention is an image data dimensionality reduction method based on two-dimensional kernel entropy component analysis, in particular one that transforms the two-dimensional image data directly, retains the spatial structure information of the image data, and improves the dimensionality reduction performance for two-dimensional image data. The method applies the kernel mapping directly to the rows or columns of the image, without converting the image into a vector; the eigenvalues and eigenvectors of the kernel matrix of the image data are inserted into the entropy estimate, the entropy components are selected and mapped, and the dimensionality reduction of the image data is realized, thereby reducing the computational complexity of the data transformation. The steps of the image data dimensionality reduction method based on two-dimensional kernel entropy component analysis of the present invention are as follows:
(1) Read in the image data;
(2) Estimate the kernel function using Parzen windows;
(3) Build the kernel matrix of all image data, computed column by column;
(4) Compute the eigenvalues and eigenvectors of the correlation matrix of the image data;
(5) Compute the Renyi entropy of the image data;
(6) Map the eigenvectors of the correlation matrix of the image data with the two-dimensional kernel entropy component analysis method, realizing the dimensionality reduction of the image data.
In step (2), the kernel function is estimated using Parzen windows and is denoted K_σ. The quadratic Renyi entropy is

H(p) = -log ∫ p^2(A) dA    (1)

where A is one of the M image data matrices of size m × n and p(A) is the probability density function of the image data matrices A = [A_1, ..., A_M]. Since H(p) is monotonic, it suffices to analyse the quadratic Renyi entropy without the negative sign, V(p) = ∫ p^2(A) dA. To estimate V(p), a Parzen-window density estimator is introduced, whose expression is

p̂(A) = (1/M) Σ_{i=1}^{M} K_σ(A, A_i)    (2)

where p̂(A) is the Parzen-window estimate of p(A); M is the number of all image data matrices; i is the index, ranging from 1 to M; K_σ(A, A_i) is the kernel function of the Parzen-window estimate; and σ is the width of the window function.
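For illustration only, the following minimal Python sketch instantiates the Parzen-window estimate of formula (2) for image matrices, assuming a Gaussian kernel for K_σ; the function names and the bandwidth value are illustrative and not prescribed by the patent.

import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Gaussian Parzen-window kernel K_sigma(a, b) between two arrays (flattened).
    d = np.ravel(a) - np.ravel(b)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def parzen_estimate(A, images, sigma=1.0):
    # Parzen-window density estimate of formula (2): average kernel value
    # between the query matrix A and each of the M image data matrices.
    return sum(gaussian_kernel(A, Ai, sigma) for Ai in images) / len(images)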
In step (3), the kernel matrix of all image data, computed column by column, is built as follows:

First, the kernel mapping is applied to every column vector of all image data, giving the kernel matrix, denoted Φ:

Φ = [φ(A_1^1), ..., φ(A_1^n), ..., φ(A_M^1), ..., φ(A_M^n)]    (3)

where A_M is the M-th m × n image data matrix, the superscript n is the total number of columns of an image data matrix, and the subscript M is the total number of image data matrices; A_M^n is the n-th column vector of the M-th image, and φ(A_M^n) is the kernel mapping of the n-th column of the M-th image data matrix.

Then the kernel matrix Φ of the image data is multiplied by its transpose; the product is the correlation kernel matrix, denoted S:

S = Φ Φ^T    (4)

where Φ^T is the transpose of the kernel matrix Φ of the image data, and the superscript T denotes transposition.
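Because the mapping φ in formula (3) is only accessible through the kernel, what can actually be computed in practice is a Gram matrix of kernel values between image columns. The following sketch builds such a matrix with a Gaussian kernel; the stacking order and names are assumptions for illustration, not the patent's reference implementation.

import numpy as np

def column_gram_matrix(images, sigma=1.0):
    # Kernel (Gram) matrix between every column of every image, the computable
    # counterpart of the products of Phi appearing in formulas (3)-(4).
    # images: list of M arrays, each of shape (m, n); result: (M*n, M*n).
    cols = np.hstack(images).T                      # all columns as rows: (M*n, m)
    sq = np.sum(cols ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * cols @ cols.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))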
In step (4), the eigenvalues and eigenvectors of the correlation matrix of the image data are computed as follows:

First, let λ be an eigenvalue of the correlation matrix of the image data and v the corresponding projection vector; they satisfy the relation

λv = Sv, or λv = Φ Φ^T v, with λ ≥ 0    (5)

Then the kernel matrix associated with the M image data, denoted Φ̃, is assumed to be

Φ̃ = [φ(Ā_1), ..., φ(Ā_M)]    (6)

where φ(Ā_M) is the kernel mapping corresponding to Ā_M, and Ā_m is the mean of the row vectors of the m-th image data matrix A_m.

Setting v = Φ̃ q, formula (5) is converted into the relation

λ Φ̃ q = Φ Φ^T Φ̃ q    (7)

Solving relation (7) gives the eigenvalues λ of the correlation kernel matrix of the image data and the corresponding eigenvectors q:

λ = [λ_1, ..., λ_M]    (8)

q = [q_1, ..., q_M]    (9)

where λ_M is the M-th eigenvalue of the correlation kernel matrix of the image data and q_M is the M-th eigenvector obtained from formula (7).

With v_m = Φ̃ q_m, the eigenvectors of the correlation kernel matrix of the image data are obtained as

v = [v_1, ..., v_M]    (10)

where v_M is the M-th eigenvector of the correlation kernel matrix of the image data.
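For step (4), the nonzero eigenvalues of S = Φ Φ^T coincide with those of the Gram matrix of kernel values, so in practice the eigen-problem of formula (7) is solved on that Gram matrix. A minimal sketch, assuming the symmetric Gram matrix K produced by the previous sketch:

import numpy as np

def kernel_eigendecomposition(K):
    # Eigenvalues and eigenvectors of the symmetric kernel Gram matrix,
    # returned in decreasing order of eigenvalue (cf. formulas (8)-(10)).
    eigvals, eigvecs = np.linalg.eigh(K)
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order]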
In step (5), the Renyi entropy of the image data, denoted V̂(p), is computed as follows:

V̂(p) = (1/M) Σ_{i=1}^{M} p̂(A_i)    (11)

where V̂(p) is the Parzen-window estimate of V(p), i.e. the Parzen-window estimate along the coordinate axes of the quadratic Renyi entropy of the original image data.

Substituting formula (2) into formula (11) gives the Parzen-window estimate of the quadratic Renyi entropy:

V̂(p) = (1/M^2) Σ_{i=1}^{M} Σ_{j=1}^{M} K_σ(A_i, A_j)    (12)

where A_i and A_j denote the i-th and j-th image data matrices. Substituting the eigenvalues and eigenvectors of the correlation kernel matrix obtained in step (4) into formula (12) yields the equivalent expression

V̂(p) = (1/M^2) Σ_{i=1}^{M} λ_i (e_i^T 1)^2    (13)

where 1 is the m × 1 all-ones vector associated with the correlation kernel matrix of the image data and 1^T is its transpose; M is the number of image data matrices; λ_i and e_i are the i-th eigenvalue and eigenvector of the correlation kernel matrix of the image data, e_i being the i-th column of the eigenvector matrix E; and e_i^T is the transpose of the i-th eigenvector of the correlation kernel matrix of the image data.
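For step (5), each eigenpair contributes a term of the form λ_i (e_i^T 1)^2 to the Parzen estimate of the quadratic Renyi entropy in formula (13). A sketch of these per-axis entropy contributions, with illustrative names, under the same Gram-matrix assumption as above:

import numpy as np

def entropy_components(eigvals, eigvecs):
    # Per-axis contributions lambda_i * (e_i^T 1)^2 to the Parzen estimate
    # of the quadratic Renyi entropy, before the 1/M^2 normalisation.
    ones = np.ones(eigvecs.shape[0])
    return eigvals * (eigvecs.T @ ones) ** 2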
In step (6), the eigenvectors of the correlation matrix of the image data are mapped using the two-dimensional kernel entropy component analysis method, realizing the dimensionality reduction of the image data, as follows:

Y = [v_1, ..., v_d]^T Φ(A)    (14)

First, the Renyi entropy values of the image data computed with formula (13) are sorted in descending order of entropy, and the entropy vector of the top d image data components is selected, denoted Z:

Z = [z_1, ..., z_d]    (15)

Then this entropy vector is mapped to obtain the mapping vectors v_1, ..., v_d of the image data kernel matrix Φ(A); the projective transformation of formula (14) yields the intrinsic dimension of the image data after reduction, thereby realizing the dimensionality reduction of the image data.
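For step (6), the axes with the largest entropy contributions are kept and the kernel data are projected onto them. The sketch below follows a standard KECA-style projection of the form D_d^{-1/2} E_d^T K; it is an illustration under that assumption, and the exact projection used by the patent may differ in detail.

import numpy as np

def keca_projection(K, eigvals, eigvecs, d):
    # Keep the d axes with the largest entropy contributions (formula (15))
    # and project the kernel data onto them (formula (14)).
    scores = eigvals * (eigvecs.T @ np.ones(K.shape[0])) ** 2
    top = np.argsort(scores)[::-1][:d]
    E_d = eigvecs[:, top]
    D_d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(eigvals[top], 1e-12)))
    return D_d_inv_sqrt @ E_d.T @ K          # shape: (d, number of mapped columns)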
Compared with the prior art, the advantages of the present invention are as follows. The method employs two-dimensional kernel entropy component analysis: the image data are transformed into a kernel matrix by row or by column, the Renyi entropy is estimated from the kernel matrix of the image data, and the intrinsic dimension of the reduced image data is obtained, realizing the dimensionality reduction of the image data. Specifically:
(1) The method uses a two-dimensional analysis, applying the kernel mapping directly to the rows or columns of the image; the entropy estimates of the kernel matrix of the image data are sorted, the intrinsic dimension of the reduced image data is obtained, and the spatial structure information of the image data is preserved;
(2) Because the kernel matrix of the image data is computed directly by row or by column, the two-dimensional image data need not be converted into one-dimensional vectors, which reduces the computational complexity when the kernel mapping is used to obtain the correlation matrix of the image data.
Brief description of the drawings
Fig. 1 is the implementation flow of the image data dimensionality reduction method based on two-dimensional kernel entropy component analysis of the present invention;
Fig. 2 is a table comparing the classification accuracy of the image data dimensionality reduction method of the present invention with that of an existing image data dimensionality reduction method.
Embodiment
In order to better illustrate the image data dimensionality reduction method based on two-dimensional kernel entropy component analysis of the present invention, dimensionality reduction and classification are carried out on face images of two different expressions from the FERET face database.
The implementation flow of the image data dimensionality reduction method based on two-dimensional kernel entropy component analysis of the present invention is shown in Fig. 1; the specific implementation steps are as follows:
(1) Read in the image data: the FERET face database images are read in; the raw images are of size 80 × 80 and, in the present embodiment, are cropped to 60 × 60;
(2) Estimate the kernel function using Parzen windows, denoted K_σ, where the quadratic Renyi entropy is

H(p) = -log ∫ p^2(A) dA    (1)

where A is one of the 200 image data matrices of size 60 × 60 and p(A) is the probability density function of the image data matrices; analysing the quadratic Renyi entropy without the negative sign, V(p) = ∫ p^2(A) dA, a Parzen-window density estimator is introduced to estimate V(p):

p̂(A) = (1/200) Σ_{i=1}^{200} K_σ(A, A_i)    (2)

where p̂(A) is the Parzen-window estimate of p(A); 200 is the number of all image data matrices; K_σ(A, A_i) is the kernel function of the Parzen-window estimate; and σ is the width of the window function;
(3) Build the kernel matrix of all image data, computed column by column, as follows:

First, the kernel mapping is applied to every column vector of all image data, giving the kernel matrix, denoted Φ:

Φ = [φ(A_1^1), ..., φ(A_1^60), ..., φ(A_200^1), ..., φ(A_200^60)]    (3)

where A_200 is the 200th 60 × 60 image data matrix, the superscript 60 is the number of columns of an image data matrix, the subscript 200 is the number of image data matrices, and A_200^60 is the 60th column vector (of size 60 × 1) of the 200th image;

Then the kernel matrix Φ of the image data is multiplied by its transpose Φ^T; the product is the correlation kernel matrix, denoted S:

S = Φ Φ^T    (4)

where Φ^T is the transpose of the kernel matrix Φ of the image data, and the superscript T denotes transposition;
(4) Compute the eigenvalues and eigenvectors of the correlation matrix of the image data, as follows:

First, let λ be an eigenvalue of the correlation matrix of the image data and v the corresponding projection vector; they satisfy the relation

λv = Sv, or λv = Φ Φ^T v, with λ ≥ 0    (5)

Then the kernel matrix associated with the 200 image data, denoted Φ̃, is assumed to be

Φ̃ = [φ(Ā_1), ..., φ(Ā_200)]    (6)

where φ(Ā_200) is the kernel mapping corresponding to Ā_200, and Ā_m is the mean of the row vectors of the m-th image data matrix;

Setting v = Φ̃ q, formula (5) is converted into the relation

λ Φ̃ q = Φ Φ^T Φ̃ q    (7)

Solving relation (7) gives the eigenvalues λ of the correlation kernel matrix of the image data and the corresponding eigenvectors q:

λ = [λ_1, ..., λ_200]    (8)

q = [q_1, ..., q_200]    (9)

where λ_200 is the 200th eigenvalue of the correlation kernel matrix of the image data and q_200 is the 200th eigenvector obtained from formula (7);

With v_m = Φ̃ q_m, the eigenvectors of the correlation kernel matrix of the image data are obtained as

v = [v_1, ..., v_200]    (10)

where v_200 is the 200th eigenvector of the correlation kernel matrix of the image data;
(5) Compute the Renyi entropy of the image data, denoted V̂(p), as follows:

V̂(p) = (1/M) Σ_{i=1}^{M} p̂(A_i)    (11)

where V̂(p) is the Parzen-window estimate of V(p), i.e. the Parzen-window estimate along the coordinate axes of the quadratic Renyi entropy of the original image data.

Substituting formula (2) into formula (11) gives the Parzen-window estimate of the quadratic Renyi entropy:

V̂(p) = (1/M^2) Σ_{i=1}^{M} Σ_{j=1}^{M} K_σ(A_i, A_j)    (12)

where A_i and A_j denote the i-th and j-th image data matrices; substituting the eigenvalues and eigenvectors of the correlation kernel matrix from step (4) into formula (12) gives the equivalent expression

V̂(p) = (1/M^2) Σ_{i=1}^{M} λ_i (e_i^T 1)^2    (13)

where 1 is the 60 × 1 all-ones vector associated with the correlation kernel matrix of the image data and 1^T is its transpose;
(6) Map the eigenvectors of the correlation matrix of the image data using the two-dimensional kernel entropy component analysis method, realizing the dimensionality reduction of the image data, as follows:

Y = [v_1, ..., v_d]^T Φ(A)    (14)

First, the Renyi entropy values of the image data computed with formula (13) are sorted in descending order of entropy, and the entropy vector of the top d image data components is selected, denoted Z:

Z = [z_1, ..., z_d]    (15)

Then this entropy vector is mapped to obtain the mapping vectors v_1, ..., v_d of the image data kernel matrix Φ(A); the projective transformation of formula (14) yields the intrinsic dimension of the image data after reduction, thereby realizing the dimensionality reduction of the image data.
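Tying the sketches above together for the setting of this embodiment: the Python fragment below reuses the helper functions from the earlier sketches and runs steps (3)-(6) end to end. The data loading is a placeholder for the cropped FERET images, a small random subset is used so the Gram matrix stays cheap, and σ and d are illustrative values, not parameters fixed by the patent.

import numpy as np

# Embodiment setting: 200 images of 60 x 60; here a 20-image placeholder keeps
# the (num_columns)^2 Gram matrix small enough for a quick run.
m, n, sigma, d = 60, 60, 5.0, 10
images = [np.random.rand(m, n) for _ in range(20)]   # placeholder for the FERET crops

K = column_gram_matrix(images, sigma)                # step (3): column-wise kernel matrix
eigvals, eigvecs = kernel_eigendecomposition(K)      # step (4): eigenvalues / eigenvectors
scores = entropy_components(eigvals, eigvecs)        # step (5): per-axis Renyi entropy terms
Y = keca_projection(K, eigvals, eigvecs, d)          # step (6): d-dimensional projection
print(Y.shape)                                       # (d, 20 * n) reduced column features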
In order to verify the effect of the image data dimensionality reduction method based on two-dimensional kernel entropy component analysis of the present invention, the dimensionality reduction method of the present invention is compared experimentally with the kernel entropy component analysis method of the prior art, as shown in Fig. 2. In the comparison table, each row corresponds to one of ten target dimensions, spaced 10 dimensions apart, so that the data features are reduced to between 100 and 10 dimensions; each column corresponds to one of the three methods compared: column-wise kernel entropy component analysis, row-wise kernel entropy component analysis, and kernel entropy component analysis. Table 1 in Fig. 2 shows that, for the same dimension, the results of two-dimensional kernel entropy component analysis are clearly better than those of kernel entropy component analysis; for the same dimension, the column-wise variant is better than the row-wise variant; and two-dimensional kernel entropy component analysis reaches its maximum at 60 dimensions, whereas kernel entropy component analysis reaches its maximum at 100 dimensions. The comparison table thus shows that the classification accuracy of the two-dimensional kernel entropy component analysis method of the present invention is better than that of the prior-art kernel entropy component analysis method.

Claims (1)

1. An image data dimensionality reduction method based on two-dimensional kernel entropy component analysis, characterized in that its steps are as follows:
(1) Read in the image data; these image data retain their spatial structure information, and their two-dimensional pixel matrices do not need to be converted into feature vectors;
(2) Estimate the kernel function using Parzen windows; the specific steps are as follows:
The kernel function estimated with Parzen windows is denoted K_σ, and the quadratic Renyi entropy is
H(p) = -log ∫ p^2(A) dA    (1)
In the formula, A is one of the M m × n image data matrices; p(A) is the probability density function of the image data matrices A = [A_1, ..., A_M]; H(p) is monotonic, so it suffices to analyse the quadratic Renyi entropy without the negative sign, V(p) = ∫ p^2(A) dA; to estimate V(p), a Parzen-window density estimator is introduced, whose expression is:
p̂(A) = (1/M) Σ_{i=1}^{M} K_σ(A, A_i)    (2)
In the formula, p̂(A) is the estimate of p(A) obtained with the Parzen window; M is the number of all image data matrices; i is the index, ranging from 1 to M; K_σ(A, A_i) is the kernel function of the Parzen-window estimate, and σ is the width of the window function;
(3) Build the kernel matrix of all image data, computed column by column; the specific steps are as follows:
First, the kernel mapping is applied to every column vector of all image data, giving the kernel matrix, denoted Φ:
Φ = [φ(A_1^1), ..., φ(A_1^n), ..., φ(A_M^1), ..., φ(A_M^n)]    (3)
In the formula, A_M is the M-th m × n image data matrix, the superscript n is the total number of columns of an image data matrix, and the subscript M is the total number of image data matrices; A_M^n is the n-th column vector of the M-th image, and φ(A_M^n) is the kernel mapping of the n-th column of the M-th image data matrix;
Then the kernel matrix Φ of the image data is multiplied by its transpose Φ^T; the product is the correlation kernel matrix, denoted S:
S = Φ Φ^T    (4)
In the formula, Φ^T is the transpose of the kernel matrix Φ of the image data; the superscript T denotes transposition;
(4) Compute the eigenvalues and eigenvectors of the correlation kernel matrix of the image data; the specific steps are as follows:
First, let λ be an eigenvalue of the correlation kernel matrix of the image data and v the corresponding projection vector; they satisfy the relation
λv = Sv, or λv = Φ Φ^T v, with λ ≥ 0    (5)
Then the kernel matrix associated with the M image data, denoted Φ̃, is assumed to be
Φ̃ = [φ(Ā_1), ..., φ(Ā_M)]    (6)
In the formula, φ(Ā_M) is the kernel mapping corresponding to Ā_M, and Ā_m is the mean of the row vectors of the m-th image data matrix A_m;
Setting v = Φ̃ q, formula (5) is converted into the relation
λ Φ̃ q = Φ Φ^T Φ̃ q    (7)
Solving relation (7) gives the eigenvalues λ of the correlation kernel matrix of the image data and the corresponding eigenvectors q:
λ = [λ_1, ..., λ_M]    (8)
q = [q_1, ..., q_M]    (9)
In the formulas, λ_M is the M-th eigenvalue of the kernel matrix of the image data and q_M is the M-th eigenvector of the correlation kernel matrix obtained from formula (7);
With v_m = Φ̃ q_m, the eigenvectors of the correlation kernel matrix of the image data are obtained as
v = [v_1, ..., v_M]    (10)
In the formula, v_M is the M-th eigenvector of the correlation kernel matrix of the image data;
(5) Compute the Renyi entropy of the image data, denoted V̂(p); the specific steps are as follows:
V̂(p) = (1/M) Σ_{i=1}^{M} p̂(A_i)    (11)
In the formula, V̂(p) is the Parzen-window estimate of V(p), i.e. the Parzen-window estimate along the coordinate axes of the quadratic Renyi entropy of the original image data;
Substituting formula (2) into formula (11) gives the Parzen-window estimate of the quadratic Renyi entropy, whose expression is:
V̂(p) = (1/M^2) Σ_{i=1}^{M} Σ_{j=1}^{M} K_σ(A_i, A_j)    (12)
In the formula, A_i and A_j denote the i-th and j-th image data matrices; substituting the eigenvalues and eigenvectors of the correlation kernel matrix from step (4) into formula (12) yields the equivalent expression for V̂(p):
V̂(p) = (1/M^2) Σ_{i=1}^{M} λ_i (e_i^T 1)^2    (13)
In the formula, 1 is the m × 1 all-ones vector associated with the correlation kernel matrix of the image data and 1^T is its transpose; M is the number of image data matrices; λ_i and e_i are the i-th eigenvalue and eigenvector of the correlation kernel matrix of the image data, e_i being the i-th column of the eigenvector matrix E; and e_i^T is the transpose of the i-th eigenvector of the correlation kernel matrix of the image data;
(6) Map the eigenvectors of the correlation kernel matrix of the image data using the two-dimensional kernel entropy component analysis method, realizing the dimensionality reduction of the image data; the specific steps are as follows:
Y = [v_1, ..., v_d]^T Φ(A)    (14)
First, the Renyi entropy values of the image data computed with formula (13) are sorted in descending order of entropy, and the entropy vector of the top d image data components is selected, denoted Z, whose expression is:
Z = [z_1, ..., z_d]    (15)
Then this entropy vector is mapped to obtain the mapping vectors of the image data kernel matrix Φ(A), denoted v_1, ..., v_d; the projective transformation of formula (14) yields the intrinsic dimension of the image data after reduction, thereby realizing the dimensionality reduction of the image data.
CN201410791475.7A 2014-12-19 2014-12-19 Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis Active CN104537377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410791475.7A CN104537377B (en) 2014-12-19 2014-12-19 Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410791475.7A CN104537377B (en) 2014-12-19 2014-12-19 Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis

Publications (2)

Publication Number Publication Date
CN104537377A CN104537377A (en) 2015-04-22
CN104537377B true CN104537377B (en) 2018-03-06

Family

ID=52852897

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410791475.7A Active CN104537377B (en) 2014-12-19 2014-12-19 Image data dimensionality reduction method based on two-dimensional kernel entropy component analysis

Country Status (1)

Country Link
CN (1) CN104537377B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794505A (en) * 2015-04-28 2015-07-22 上海大学 Multichannel electroencephalogram data fusion and dimension descending method
CN106548203A (en) * 2016-10-21 2017-03-29 北京信息科技大学 A kind of fast automatic point of group of multiparameter flow cytometry data and gating method
CN115206551A (en) * 2022-08-10 2022-10-18 北京京东拓先科技有限公司 Health state monitoring method and device based on digital twins

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7318005B1 (en) * 2006-07-07 2008-01-08 Mitsubishi Electric Research Laboratories, Inc. Shift-invariant probabilistic latent component analysis
CN104198924A (en) * 2014-09-11 2014-12-10 合肥工业大学 Novel analog circuit early fault diagnosis method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1977393A4 (en) * 2006-01-18 2013-05-08 Technion Res & Dev Foundation System and method for dehazing

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7318005B1 (en) * 2006-07-07 2008-01-08 Mitsubishi Electric Research Laboratories, Inc. Shift-invariant probabilistic latent component analysis
CN104198924A (en) * 2014-09-11 2014-12-10 合肥工业大学 Novel analog circuit early fault diagnosis method

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Kernel Entropy Component Analysis; Robert Jenssen et al.; Transactions on Pattern Analysis and Machine Intelligence; 2010-05-31; Vol. 32, No. 5, pp. 847-860 *
Two-Dimensional PCA: A New Approach to Appearance-Based Face Representation and Recognition; Jian Yang et al.; Transactions on Pattern Analysis and Machine Intelligence; 2004-01-31; Vol. 26, No. 1, pp. 131-137 *
An appearance-based method for face description and recognition; 李莉; Journal of Xingtai University; 2005-06-30; Vol. 20, No. 2, pp. 107-109 *
Data dimensionality reduction based on kernel entropy component analysis; 黄丽瑾 et al.; Computer Engineering; 2012-01-31; Vol. 38, No. 2, pp. 175-177 *
Hyperspectral remote sensing image classification algorithm based on kernel entropy component analysis; 王瀛 et al.; Journal of Jilin University (Engineering and Technology Edition); 2012-11-30; Vol. 42, No. 6, pp. 1597-1601 *
JointBoost point cloud classification and feature dimensionality reduction considering spatial context; 郭波 et al.; Acta Geodaetica et Cartographica Sinica; 2013-10-31; Vol. 42, No. 5, pp. 715-721 *

Also Published As

Publication number Publication date
CN104537377A (en) 2015-04-22

Similar Documents

Publication Publication Date Title
CN104751191B (en) A kind of Hyperspectral Image Classification method of sparse adaptive semi-supervised multiple manifold study
CN102629374B (en) Image super resolution (SR) reconstruction method based on subspace projection and neighborhood embedding
CN106557784A (en) Fast target recognition methodss and system based on compressed sensing
CN101556690A (en) Image super-resolution method based on overcomplete dictionary learning and sparse representation
CN105261000A (en) Hyperspectral image fusion method based on end member extraction and spectrum unmixing
CN110414600A (en) A kind of extraterrestrial target small sample recognition methods based on transfer learning
CN110111256A (en) Image Super-resolution Reconstruction method based on residual error distillation network
CN104008394B (en) Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle
CN114648684A (en) Lightweight double-branch convolutional neural network for image target detection and detection method thereof
CN103268484A (en) Design method of classifier for high-precision face recognitio
CN104537377B (en) A kind of view data dimension reduction method based on two-dimentional nuclear entropy constituent analysis
CN111860683A (en) Target detection method based on feature fusion
CN101609503B (en) Face super-resolution image processing method based on double-manifold alignment
Niu et al. Machine learning-based framework for saliency detection in distorted images
CN116843975A (en) Hyperspectral image classification method combined with spatial pyramid attention mechanism
CN115937693A (en) Road identification method and system based on remote sensing image
CN104299201B (en) Image reconstruction method based on heredity sparse optimization
Hu et al. Hyperspectral image super-resolution based on multiscale mixed attention network fusion
CN105975940A (en) Palm print image identification method based on sparse directional two-dimensional local discriminant projection
CN102129570B (en) Method for designing manifold based regularization based semi-supervised classifier for dynamic vision
CN105719323A (en) Hyperspectral dimension reducing method based on map optimizing theory
CN103440625B (en) The Hyperspectral imagery processing method strengthened based on textural characteristics
CN102289679B (en) Method for identifying super-resolution of face in fixed visual angle based on related characteristics and nonlinear mapping
CN108596831B (en) Super-resolution reconstruction method based on AdaBoost example regression
CN114627370A (en) Hyperspectral image classification method based on TRANSFORMER feature fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant