CN112927763B - Prediction method for odor descriptor rating based on electronic nose - Google Patents
- Publication number
- CN112927763B CN112927763B CN202110247208.3A CN202110247208A CN112927763B CN 112927763 B CN112927763 B CN 112927763B CN 202110247208 A CN202110247208 A CN 202110247208A CN 112927763 B CN112927763 B CN 112927763B
- Authority
- CN
- China
- Prior art keywords
- odor
- electronic nose
- des
- descriptor
- odor descriptor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16C—COMPUTATIONAL CHEMISTRY; CHEMOINFORMATICS; COMPUTATIONAL MATERIALS SCIENCE
- G16C20/00—Chemoinformatics, i.e. ICT specially adapted for the handling of physicochemical or structural data of chemical particles, elements, compounds or mixtures
- G16C20/30—Prediction of properties of chemical compounds, compositions or mixtures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/0004—Gaseous mixtures, e.g. polluted air
- G01N33/0009—General constructional details of gas analysers, e.g. portable test equipment
- G01N33/0062—General constructional details of gas analysers, e.g. portable test equipment concerning the measuring method or the display, e.g. intermittent measurement or digital display
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/0004—Gaseous mixtures, e.g. polluted air
- G01N33/0009—General constructional details of gas analysers, e.g. portable test equipment
- G01N33/0062—General constructional details of gas analysers, e.g. portable test equipment concerning the measuring method or the display, e.g. intermittent measurement or digital display
- G01N33/0068—General constructional details of gas analysers, e.g. portable test equipment concerning the measuring method or the display, e.g. intermittent measurement or digital display using a computer specifically programmed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16C—COMPUTATIONAL CHEMISTRY; CHEMOINFORMATICS; COMPUTATIONAL MATERIALS SCIENCE
- G16C20/00—Chemoinformatics, i.e. ICT specially adapted for the handling of physicochemical or structural data of chemical particles, elements, compounds or mixtures
- G16C20/70—Machine learning, data mining or chemometrics
Abstract
The invention provides a prediction method for odor descriptor rating based on an electronic nose, comprising the following steps: S1: acquiring an electronic nose signal, and slicing it in preprocessing to obtain a plurality of electronic nose signal segments; S2: constructing a plurality of same-level deep learning models, inputting the electronic nose signal segments into the respective models, and extracting the temporal and spatial features of each segment; S3: predicting the rating of the odor descriptor corresponding to the odor sample from the temporal and spatial features of each segment. The method addresses the problem that existing research ignores the spatio-temporal correlation within electronic nose signals, which limits prediction accuracy.
Description
Technical Field
The invention relates to the technical field of electronic nose detection, and in particular to a prediction method for odor descriptor rating based on an electronic nose.
Background
An odor descriptor expresses olfactory perception in olfaction-related words. Olfactory research aims to link the physical and chemical properties of odors to odor descriptors, whose ratings have been shown to correlate with chemical properties such as carbon-chain length and molecular size. Data acquired by electronic noses and mass spectrometers have also been used to predict odor descriptors. To some extent, the electronic nose can simplify the evaluation process of odor descriptor rating.
At present, random forest models, regularized linear models, back-propagation (BP) neural networks, convolutional neural networks (CNN) and the like are applied to electronic nose signal processing. However, an electronic nose signal contains both spatial information (correlation among sensors) and temporal information (sensor responses changing over time), and current research does not consider this spatio-temporal correlation, so prediction accuracy remains insufficient.
In the prior art, for example, Chinese patent CN110726813A, published 2020-01-24, discloses an electronic nose prediction method based on a double-layer integrated neural network. It combines the convolutional neural network's ability to automatically extract abstract features from a data set with its strong fitting capability, and uses an ensemble algorithm to improve the generalization and stability of the prediction model, thereby improving the detection performance of the electronic nose; however, it does not consider the spatio-temporal correlation within electronic nose signals.
Disclosure of Invention
The invention provides a prediction method for odor descriptor rating based on an electronic nose, aiming to overcome the technical defect that existing research does not consider the spatio-temporal correlation within electronic nose signals, leaving prediction accuracy insufficient.
In order to solve the technical problems, the technical scheme of the invention is as follows:
A prediction method for an electronic-nose-based odor descriptor rating, comprising the steps of:
s1: acquiring an electronic nose signal, and slicing it in preprocessing to obtain a plurality of electronic nose signal segments;
s2: constructing a plurality of same-level deep learning models, inputting the electronic nose signal segments into the respective models, and extracting the temporal and spatial features of each segment;
s3: predicting the rating of the odor descriptor corresponding to the odor sample from the temporal and spatial features of each segment.
Preferably, in step S1, the length of the electronic nose signal in the time dimension is assumed to be m, and the signal is sliced along the time dimension as follows:
divide m into g time intervals of length l = m/g, i.e. {[1, …, l], [l+1, …, 2l], …, [m−l+1, …, m]}, each interval corresponding to one signal segment V_i;
the electronic nose signal is thus cut into the segment set {V_1, …, V_g}, where V_i is the ith electronic nose signal segment.
Preferably, each same-level deep learning model comprises a ConvLSTM-based depth feature extractor and a fully connected classifier; the ConvLSTM-based depth feature extractor comprises a CNN network and an LSTM network.
Preferably, in step S2, the spatial features of each electronic nose signal segment are extracted by the CNN network, and the temporal features by the LSTM network.
Preferably, the output H_t of the ConvLSTM-based depth feature extractor is calculated by the following formulas:
H_t = o_t ⊙ tanh(C_t)
o_t = σ(W_Vo * V_t + W_ho * H_{t−1} + W_co ⊙ C_t + b_o)
C_t = f_t ⊙ C_{t−1} + i_t ⊙ g_t
i_t = σ(W_Vi * V_t + W_hi * H_{t−1} + W_ci ⊙ C_{t−1} + b_i)
f_t = σ(W_Vf * V_t + W_hf * H_{t−1} + W_cf ⊙ C_{t−1} + b_f)
g_t = tanh(W_Vc * V_t + W_hc * H_{t−1} + b_c)
where * denotes the convolution operator and ⊙ the Hadamard product; i_t, f_t and o_t are the input gate, forget gate and output gate at time t; C_t is the memory cell at time t; σ is the activation function; V_t is the input at time t and H_{t−1} the hidden state at time t−1; W_Vo, W_ho and W_co are the weights from V_t, H_{t−1} and C_t to o_t, with bias b_o; W_Vi, W_hi and W_ci are the weights from V_t, H_{t−1} and C_{t−1} to i_t, with bias b_i; W_Vf, W_hf and W_cf are the weights from V_t, H_{t−1} and C_{t−1} to f_t, with bias b_f; W_Vc and W_hc are the weights from V_t and H_{t−1} to C_t, with bias b_c.
Preferably, before predicting the rating of the odor descriptors, the method further comprises: combining odor descriptor des_i with odor descriptor des_j according to their similarity.
Preferably, the similarity between odor descriptors des_i and des_j is calculated as follows:
all odor descriptor ratings of n odor samples are represented as a 2-D matrix Y of n rows and k columns, where each odor sample has k odor descriptors {des_1, …, des_k} and each odor descriptor des_i corresponds to an odor descriptor rating y_i;
dr_i is the ith column vector of Y:
dr_i = [y_{1i}, y_{2i}, …, y_{ni}]^T
where y_{ni} is the rating of the ith odor descriptor of odor sample n; the similarity between des_i and des_j is then
sim_{i,j} = S(dr_i, dr_j)
where m_i is the mean of dr_i, m_j is the mean of dr_j, and S(·,·) is a bivariate function measuring the similarity of two input vectors; the larger sim_{i,j}, the more similar des_i and des_j.
Preferably, the Pearson correlation coefficient is taken as the bivariate function S(·,·).
Preferably, the combination of odor descriptors des_i and des_j is obtained as follows:
for each odor descriptor des_i, the most similar descriptor des_{i*} is found by
i* = argmax_{j≠i} S(dr_i, dr_j);
a threshold α is preset; if S(dr_i, dr_{i*}) > α, the combination of des_i and des_{i*} is Cb = (des_i, des_{i*}); otherwise des_i is not combined with any descriptor.
Preferably, the preset threshold α =0.6.
Compared with the prior art, the technical scheme of the invention has the following beneficial effect:
the invention provides a prediction method for odor descriptor rating based on an electronic nose, which predicts the rating of an odor descriptor by simultaneously extracting the temporal and spatial features of the electronic nose signal, thereby accounting for the spatio-temporal correlation within electronic nose signals and improving prediction accuracy.
Drawings
FIG. 1 is a flow chart of the steps for implementing the technical solution of the present invention;
FIG. 2 is a schematic diagram of the present invention;
FIG. 3 is a schematic diagram of slice pre-processing of an electronic nose signal in accordance with the present invention;
FIG. 4 is a schematic diagram of the present invention for extracting the temporal and spatial features of each electronic nose signal segment;
fig. 5 is a schematic diagram of the combined output of the scent descriptors in the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described with reference to the drawings and the embodiments.
Example 1
As shown in figs. 1-2, a prediction method for an electronic-nose-based odor descriptor rating comprises the following steps:
s1: acquiring an electronic nose signal, and slicing it in preprocessing to obtain a plurality of electronic nose signal segments;
s2: constructing a plurality of same-level deep learning models, inputting the electronic nose signal segments into the respective models, and extracting the temporal and spatial features of each segment;
s3: predicting the rating of the odor descriptor corresponding to the odor sample from the temporal and spatial features of each segment.
Example 2
More specifically, as shown in fig. 3, in step S1 the length of the electronic nose signal in the time dimension is assumed to be m, and the signal is sliced along the time dimension as follows:
divide m into g time intervals of length l = m/g, i.e. {[1, …, l], [l+1, …, 2l], …, [m−l+1, …, m]}, each interval corresponding to one signal segment V_i;
the electronic nose signal is thus cut into the segment set {V_1, …, V_g}, where V_i is the ith electronic nose signal segment.
In the specific implementation, slicing preprocessing converts the electronic nose signal into a sequence of two-dimensional frames, making it suitable for the convolutional linear matrix transformation.
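The slicing step above can be sketched in a few lines of NumPy. The function name `slice_signal` and the array layout (m time steps × s sensors, with m divisible by g) are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def slice_signal(X, g):
    """Slice an electronic nose signal X (m time steps x s sensors)
    into g segments V_1..V_g of length l = m / g along time."""
    m = X.shape[0]
    assert m % g == 0, "signal length m must be divisible by g"
    l = m // g
    # each segment covers interval [i*l+1, ..., (i+1)*l] of the time axis
    return [X[i * l:(i + 1) * l] for i in range(g)]
```

For example, a 12-step, 2-sensor recording sliced with g = 4 yields four 3 × 2 frames ready to feed the ConvLSTM-based extractors.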
More specifically, each same-level deep learning model comprises a ConvLSTM-based depth feature extractor and a fully connected classifier, the extractor comprising a CNN network and an LSTM network. In practice, the method further comprises training the same-level deep learning models. During training, the ConvLSTM-based depth feature extractor learns the parameters of the neural network, with the electronic nose signal as input and the odor descriptor rating as label. The ratings of different odor descriptors are trained separately, i.e. several joint trainings and several independent trainings are performed, each as an end-to-end process. The electronic nose signals are then converted into feature vectors using the network parameters learned for the rating of each odor descriptor, and these feature vectors are used to update the fully connected classifier.
More specifically, in step S2, the spatial features of each electronic nose signal segment are extracted by the CNN network and the temporal features by the LSTM network, so that the temporal and spatial features of the electronic nose signal are effectively extracted.
In implementation, each slice of the electronic nose signal can be observed as a 2-D image. If a picture is divided into tiled, non-overlapping patches and the pixels inside each patch are regarded as its measurements, the problem the electronic nose signal must solve naturally becomes a spatio-temporal sequence problem. The preprocessed electronic nose signal [V_1, …, V_g] is input into the ConvLSTM-based depth feature extractor, where the CNN network extracts the spatial features and the LSTM network extracts the temporal features.
More specifically, the output H_t of the ConvLSTM-based depth feature extractor is calculated by the following formulas:
H_t = o_t ⊙ tanh(C_t)
o_t = σ(W_Vo * V_t + W_ho * H_{t−1} + W_co ⊙ C_t + b_o)
C_t = f_t ⊙ C_{t−1} + i_t ⊙ g_t
i_t = σ(W_Vi * V_t + W_hi * H_{t−1} + W_ci ⊙ C_{t−1} + b_i)
f_t = σ(W_Vf * V_t + W_hf * H_{t−1} + W_cf ⊙ C_{t−1} + b_f)
g_t = tanh(W_Vc * V_t + W_hc * H_{t−1} + b_c)
where * denotes the convolution operator and ⊙ the Hadamard product; i_t, f_t and o_t are the input gate, forget gate and output gate at time t; C_t is the memory cell at time t; σ is the activation function; V_t is the input at time t and H_{t−1} the hidden state at time t−1; W_Vo, W_ho and W_co are the weights from V_t, H_{t−1} and C_t to o_t, with bias b_o; W_Vi, W_hi and W_ci are the weights from V_t, H_{t−1} and C_{t−1} to i_t, with bias b_i; W_Vf, W_hf and W_cf are the weights from V_t, H_{t−1} and C_{t−1} to f_t, with bias b_f; W_Vc and W_hc are the weights from V_t and H_{t−1} to C_t, with bias b_c.
In implementation, the hidden states [H_1, …, H_g] output by the multiple ConvLSTM-based depth feature extractors (the H_t of each extractor's output layer) form the output of the model. This output is a 3-D tensor and is converted into a vector in row-major order to match the input of the regressor (i.e., the fully connected classifier), as shown in fig. 4.
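As an illustration of the gate equations above, the following sketch implements a single ConvLSTM step for one channel in plain NumPy. The function and dictionary-key names (`convlstm_step`, `W['Vi']`, …) are illustrative assumptions; a real implementation would use a deep learning framework with learned multi-channel kernels:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(x, w):
    """Naive 'same'-padded 2-D cross-correlation of one feature map x
    with an odd-sized kernel w (stands in for the * operator)."""
    kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * w)
    return out

def convlstm_step(V_t, H_prev, C_prev, W, b):
    """One ConvLSTM time step following the patent's equations.
    W maps 'Vi','hi',... to conv kernels and 'ci','cf','co' to
    per-position (Hadamard) weights; b maps 'i','f','c','o' to biases."""
    i_t = sigmoid(conv2d_same(V_t, W['Vi']) + conv2d_same(H_prev, W['hi'])
                  + W['ci'] * C_prev + b['i'])                  # input gate
    f_t = sigmoid(conv2d_same(V_t, W['Vf']) + conv2d_same(H_prev, W['hf'])
                  + W['cf'] * C_prev + b['f'])                  # forget gate
    g_t = np.tanh(conv2d_same(V_t, W['Vc'])
                  + conv2d_same(H_prev, W['hc']) + b['c'])      # candidate memory
    C_t = f_t * C_prev + i_t * g_t                              # memory cell
    o_t = sigmoid(conv2d_same(V_t, W['Vo']) + conv2d_same(H_prev, W['ho'])
                  + W['co'] * C_t + b['o'])                     # output gate
    H_t = o_t * np.tanh(C_t)                                    # hidden state
    return H_t, C_t
```

Running the step over the segments [V_1, …, V_g] in order and stacking the resulting H_t values yields the 3-D tensor that is flattened for the fully connected classifier.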
More specifically, as shown in fig. 5, before predicting the rating of the odor descriptors, the method further comprises: combining odor descriptor des_i with odor descriptor des_j according to their similarity.
In a specific implementation process, the accuracy of rating prediction can be effectively improved by combining the odor descriptors according to the similarity.
More specifically, the similarity between odor descriptors des_i and des_j is calculated as follows:
all odor descriptor ratings of n odor samples are represented as a 2-D matrix Y of n rows and k columns, where each odor sample has k odor descriptors {des_1, …, des_k} and each odor descriptor des_i corresponds to an odor descriptor rating y_i;
dr_i is the ith column vector of Y:
dr_i = [y_{1i}, y_{2i}, …, y_{ni}]^T
where y_{ni} is the rating of the ith odor descriptor of odor sample n; the similarity between des_i and des_j is then
sim_{i,j} = S(dr_i, dr_j)
where m_i is the mean of dr_i, m_j is the mean of dr_j, and S(·,·) is a bivariate function measuring the similarity of two input vectors; the larger sim_{i,j}, the more similar des_i and des_j.
More specifically, the Pearson correlation coefficient is taken as the bivariate function S(·,·).
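A minimal sketch of the descriptor similarity computation, assuming the ratings are stored as an n × k NumPy array (rows = samples, columns = descriptors); the name `descriptor_similarity` is illustrative. Centering each column by its mean m_i and normalizing gives the Pearson correlation used for S(·,·):

```python
import numpy as np

def descriptor_similarity(Y):
    """Pairwise Pearson similarity sim_{i,j} between the columns
    (descriptor rating vectors dr_i) of Y; assumes each column
    has nonzero variance."""
    Yc = Y - Y.mean(axis=0)                 # subtract column means m_i
    norms = np.linalg.norm(Yc, axis=0)
    return (Yc.T @ Yc) / np.outer(norms, norms)
```

The result is a k × k matrix whose (i, j) entry is sim_{i,j}; it matches `np.corrcoef(Y, rowvar=False)`.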
More specifically, the combination of odor descriptors des_i and des_j is obtained as follows:
for each odor descriptor des_i, the most similar descriptor des_{i*} is found by
i* = argmax_{j≠i} S(dr_i, dr_j);
a threshold α is preset; if S(dr_i, dr_{i*}) > α, the combination of des_i and des_{i*} is Cb = (des_i, des_{i*}); otherwise des_i is not combined with any descriptor.
In a specific implementation, the process of combining the odor descriptors is as follows:
Input: k odor descriptors {des_1, …, des_k}; the k descriptor ratings of n odor samples, Y
Output: the combinations C = {Cb_1, Cb_2, …, Cb_p}
Initialization: C = {}, α = 0.6
For des_i in {des_1, des_2, …, des_k}:
    get dr_i = [y_{1i}, y_{2i}, …, y_{ni}]^T
    find i* = argmax_{j≠i} S(dr_i, dr_j)
    if S(dr_i, dr_{i*}) ≥ α:
        construct the combination (des_i, des_{i*})
        append (des_i, des_{i*}) to C
Return C
more specifically, the preset threshold α =0.6.
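The combination procedure above can be sketched as follows, under the assumption that Y is an n × k ratings matrix and `names` holds the k descriptor names; `combine_descriptors` and the example descriptor names are illustrative, not from the patent:

```python
import numpy as np

def combine_descriptors(Y, names, alpha=0.6):
    """For each descriptor column of Y, find its most similar
    descriptor by Pearson correlation and pair them if the
    similarity reaches the threshold alpha."""
    sim = np.corrcoef(Y, rowvar=False)   # k x k Pearson matrix S(dr_i, dr_j)
    np.fill_diagonal(sim, -np.inf)       # exclude j == i from the argmax
    combos = []
    for i in range(Y.shape[1]):
        i_star = int(np.argmax(sim[i]))  # i* = argmax_{j != i} S(dr_i, dr_j)
        if sim[i, i_star] >= alpha:
            combos.append((names[i], names[i_star]))
    return combos
```

Note that with this per-descriptor rule a pair of mutually most similar descriptors appears twice, once from each direction; deduplication is left to the caller.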
It should be understood that the above embodiments are merely examples intended to clearly illustrate the invention and do not limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the invention shall fall within the protection scope of its claims.
Claims (3)
1. A prediction method for an electronic-nose-based odor descriptor rating, comprising the steps of:
s1: acquiring an electronic nose signal, and slicing it in preprocessing to obtain a plurality of electronic nose signal segments;
in step S1, the length of the electronic nose signal in the time dimension is assumed to be m, and the signal is sliced along the time dimension as follows:
divide m into g time intervals of length l = m/g, i.e. {[1, …, l], [l+1, …, 2l], …, [m−l+1, …, m]}, each interval corresponding to one signal segment V_i;
the electronic nose signal is thus cut into the segment set {V_1, …, V_g}, where V_i is the ith electronic nose signal segment;
s2: constructing a plurality of same-level deep learning models, inputting the electronic nose signal segments into the respective models, and extracting the temporal and spatial features of each segment;
each same-level deep learning model comprises a ConvLSTM-based depth feature extractor and a fully connected classifier; the ConvLSTM-based depth feature extractor comprises a CNN network and an LSTM network;
in step S2, the spatial features of each electronic nose signal segment are extracted by the CNN network, and the temporal features by the LSTM network;
the output H_t of the ConvLSTM-based depth feature extractor is calculated by the following formulas:
H_t = o_t ⊙ tanh(C_t)
o_t = σ(W_Vo * V_t + W_ho * H_{t−1} + W_co ⊙ C_t + b_o)
C_t = f_t ⊙ C_{t−1} + i_t ⊙ g_t
i_t = σ(W_Vi * V_t + W_hi * H_{t−1} + W_ci ⊙ C_{t−1} + b_i)
f_t = σ(W_Vf * V_t + W_hf * H_{t−1} + W_cf ⊙ C_{t−1} + b_f)
g_t = tanh(W_Vc * V_t + W_hc * H_{t−1} + b_c)
where * denotes the convolution operator and ⊙ the Hadamard product; i_t, f_t and o_t are the input gate, forget gate and output gate at time t; C_t is the memory cell at time t; σ is the activation function; V_t is the input at time t and H_{t−1} the hidden state at time t−1; W_Vo, W_ho and W_co are the weights from V_t, H_{t−1} and C_t to o_t, with bias b_o; W_Vi, W_hi and W_ci are the weights from V_t, H_{t−1} and C_{t−1} to i_t, with bias b_i; W_Vf, W_hf and W_cf are the weights from V_t, H_{t−1} and C_{t−1} to f_t, with bias b_f; W_Vc and W_hc are the weights from V_t and H_{t−1} to C_t, with bias b_c;
s3: predicting the rating of the odor descriptor corresponding to the odor sample from the temporal and spatial features of each electronic nose signal segment;
before predicting the rating of the odor descriptors, the method further comprises: combining odor descriptor des_i with odor descriptor des_j according to their similarity;
the similarity between odor descriptors des_i and des_j is calculated as follows:
all odor descriptor ratings of n odor samples are represented as a 2-D matrix Y of n rows and k columns, where each odor sample has k odor descriptors {des_1, …, des_k} and each odor descriptor des_i corresponds to an odor descriptor rating y_i;
dr_i is the ith column vector of Y:
dr_i = [y_{1i}, y_{2i}, …, y_{ni}]^T
where y_{ni} is the rating of the ith odor descriptor of odor sample n; the similarity between des_i and des_j is then
sim_{i,j} = S(dr_i, dr_j)
where m_i is the mean of dr_i, m_j is the mean of dr_j, and S(·,·) is a bivariate function measuring the similarity of two input vectors; the larger sim_{i,j}, the more similar des_i and des_j;
the combination of odor descriptors des_i and des_j is obtained as follows:
for each odor descriptor des_i, the most similar descriptor des_{i*} is found by
i* = argmax_{j≠i} S(dr_i, dr_j).
2. The prediction method for an electronic-nose-based odor descriptor rating according to claim 1, wherein the Pearson correlation coefficient is taken as the bivariate function S(·,·).
3. The prediction method for an electronic-nose-based odor descriptor rating according to claim 1, wherein the preset threshold α = 0.6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110247208.3A CN112927763B (en) | 2021-03-05 | 2021-03-05 | Prediction method for odor descriptor rating based on electronic nose |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110247208.3A CN112927763B (en) | 2021-03-05 | 2021-03-05 | Prediction method for odor descriptor rating based on electronic nose |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112927763A CN112927763A (en) | 2021-06-08 |
CN112927763B true CN112927763B (en) | 2023-04-07 |
Family
ID=76171667
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110247208.3A Active CN112927763B (en) | 2021-03-05 | 2021-03-05 | Prediction method for odor descriptor rating based on electronic nose |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112927763B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110794090A (en) * | 2019-10-22 | 2020-02-14 | 天津大学 | Emotion electronic nose implementation method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105929113A (en) * | 2016-04-20 | 2016-09-07 | Chongqing University | Electronic nose signal error adaptive learning method with subspace projection |
CN109239207A (en) * | 2018-07-23 | 2019-01-18 | Sun Yat-sen University | Odor identification method, apparatus and electronic nose system based on electronic nose |
CN110411955A (en) * | 2019-07-15 | 2019-11-05 | Zhongshan Ophthalmic Center, Sun Yat-sen University | Artificial intelligence training system for predicting substance color and odor based on molecular characterization |
CN110726813A (en) * | 2019-10-12 | 2020-01-24 | Zhejiang University | Electronic nose prediction method based on double-layer integrated neural network |
CN111103325A (en) * | 2019-12-19 | 2020-05-05 | Nanjing Yideguan Electronic Technology Co., Ltd. | Electronic nose signal drift compensation method based on integrated neural network learning |
CN111954812A (en) * | 2017-12-08 | 2020-11-17 | Yeda Research and Development Co., Ltd. | Utilization of electronic-nose-based odorant analysis |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8880448B2 (en) * | 2009-07-23 | 2014-11-04 | Yeda Research And Development Co. Ltd. | Predicting odor pleasantness with an electronic nose |
Non-Patent Citations (1)
Title |
---|
Signal processing method for gas grade identification based on electronic nose; Liu Yu et al.; Journal of Signal Processing; 2007-04-30; Vol. 23, No. 2; pp. 301-305 * |
Similar Documents
Publication | Title |
---|---|
CN108133188B (en) | Behavior identification method based on motion history image and convolutional neural network |
CN111626245B (en) | Human behavior identification method based on video key frame |
CN109740057B (en) | Knowledge extraction-based enhanced neural network and information recommendation method |
CN104700100A (en) | Feature extraction method for high spatial resolution remote sensing big data |
CN109886102B (en) | Fall-down behavior time-space domain detection method based on depth image |
CN112927763B (en) | Prediction method for odor descriptor rating based on electronic nose |
CN106056135A (en) | Human body motion classification method based on compression perception |
CN111368634A (en) | Human head detection method, system and storage medium based on neural network |
CN117392604A (en) | Real-time information monitoring and management system and method for Internet of things |
CN113283282A (en) | Weak supervision time sequence action detection method based on time domain semantic features |
CN116434002A (en) | Smoke detection method, system, medium and equipment based on lightweight neural network |
CN111008570A (en) | Video understanding method based on compression-excitation pseudo-three-dimensional network |
CN106528679A (en) | Time series analysis method based on multilinear autoregression model |
CN112541010B (en) | User gender prediction method based on logistic regression |
CN109859244B (en) | Visual tracking method based on convolution sparse filtering |
CN112163494A (en) | Video false face detection method and electronic device |
CN108319935B (en) | Face group identification method based on region sparsity |
CN116523711A (en) | Education supervision system and method based on artificial intelligence |
CN113255789B (en) | Video quality evaluation method based on confrontation network and multi-tested electroencephalogram signals |
CN112465054B (en) | FCN-based multivariate time series data classification method |
CN111681748B (en) | Medical behavior action normalization evaluation method based on intelligent visual perception |
CN114565785A (en) | Unsupervised video anomaly detection method based on three-branch twin network |
CN110751673B (en) | Target tracking method based on ensemble learning |
CN115374931A (en) | Deep neural network robustness enhancing method based on meta-countermeasure training |
CN114463667A (en) | Small sample learning method based on video identification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||