CN104636753A - Region characteristic extraction method based on PCNN (Pulse Coupled Neural Network) neuron activation rate and cluster dispersion - Google Patents


Info

Publication number
CN104636753A
CN104636753A (application CN201510056549.7A)
Authority
CN
China
Prior art keywords
neuron
group
gray
activated
pcnn
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510056549.7A
Other languages
Chinese (zh)
Other versions
CN104636753B (en)
Inventor
卞红雨
李曙光
张志刚
张健
陈奕名
韩冷
刘珈麟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN201510056549.7A priority Critical patent/CN104636753B/en
Publication of CN104636753A publication Critical patent/CN104636753A/en
Application granted granted Critical
Publication of CN104636753B publication Critical patent/CN104636753B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/08 Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12 Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a region characteristic extraction method based on PCNN (Pulse Coupled Neural Network) neuron activation rate and cluster dispersion. The method comprises the following steps: 1, preprocessing the original image and putting the PCNN in correspondence with the image; 2, partitioning the 0-255 gray range into N successively lower gray intervals according to the target region; 3, obtaining the neurons that undergo cluster activation in the current gray interval; 4, counting the excited neurons in the neighborhoods of the cluster-activated neurons and judging which of them are activated in advance; 5, counting the neurons activated in advance and obtaining the cluster activation rate and the cluster dispersion; and 6, reading the next gray interval and repeating steps 3 to 6 until the N-th gray interval. The method has low computational complexity and good classification performance.

Description

A region feature extraction method based on PCNN neuron activation rate and group dispersion
Technical field
The invention belongs to the field of feature extraction for sonar image target regions, and in particular relates to a region feature extraction method based on PCNN neuron activation rate and group dispersion, capable of extracting the pixel-value distribution characteristics of a target region.
Background technology
Despite the development of science and technology, machines still lag far behind mammals, humans included, in their ability to recognize images, so a large number of researchers at home and abroad have continually explored ways to simulate animal visual systems; although neural network research has passed through one trough after another, the steps of scientists have never stopped. With the proposal of the third-generation neural network, the PCNN, research on neural networks entered a new chapter: in 2005 Lindblad and Kinser set out an artificial neural network based on mammalian visual cortical neurons in the second edition of their PCNN treatise, a further step forward in machine vision research, and many new results have since been obtained on applying models of the animal brain to image processing. For example, Johnson proposed using a PCNN to convert a two-dimensional image into a one-dimensional pulse sequence, and other researchers have proposed PCNN-based image layering methods. Document 1: Bo Yu, Liming Zhang, "Pulse-Coupled Neural Networks for Contour and Motion Matchings", IEEE Transactions on Neural Networks, pp. 1186-1201, 2004, uses the chain-conduction activation of a PCNN to match and recognize the edges of ideal figures. Document 2: Liu Qing, Ma Yide, "A new PCNN image target recognition method based on the histogram vector center of gravity", Application of Electronic Technique, No. 10, 2006. Document 3: Liu Qing, Xu Luping, Ma Yide, Zhang Hua, "A new PCNN small-target image detection method combined with gray-level entropy transformation", Journal of Beijing Institute of Technology, Vol. 29, No. 12, 2009. These latter two documents describe image features for recognition with a matrix built from the pulse firing times.
Generally speaking, papers that use the PCNN for image recognition fall into two broad classes. The first builds more complex PCNN structures, treating the increased complexity of the network as the innovation; the second analyzes statistics of the number of PCNN neurons undergoing group activation and uses them for image segmentation and denoising. These approaches have two shortcomings. First, they ignore the dynamic influence of a PCNN neuron's neighborhood on the central neuron and do not reflect the dynamic interaction between the central and auxiliary neurons. Second, they ignore the distribution characteristics of pre-activated neurons when many PCNN neurons fire, and so fail to exploit the distribution properties of pre-activated neurons.
Summary of the invention
The object of the present invention is to provide a region feature extraction method based on PCNN neuron activation rate and group dispersion that has low computational complexity and good classification performance.
The present invention is achieved by the following technical solutions:
A region feature extraction method based on PCNN neuron activation rate and group dispersion comprises the following steps:
Step 1: acquire the original image, preprocess it, and segment the contour of the target region; put the PCNN in correspondence with the image, with each central neuron corresponding to an image pixel and the neuron's neighborhood corresponding to the neighboring pixels, the input of each neuron being the gray value of its pixel;
Step 2: by setting the threshold parameters of the PCNN neurons, divide the gray range 0-255 into N gray intervals according to the target region, arranged in order of decreasing gray value;
Step 3: read the current, k-th, gray interval and obtain the neurons undergoing group activation in it, i.e. the neurons whose pixel values lie within the gray range of the current interval;
Step 4: count the number λ of excited neurons in the neighborhoods of the group-activated neurons and recompute the internal activity of these excited neurons; if the internal activity of an excited neuron lies within the gray range of the current interval, that excited neuron is a pre-activated neuron, and the coordinates of its pixel are recorded;
Step 5: count the number λ' of pre-activated neurons and obtain the group activity rate and the group dispersion, where x and y denote the coordinates of the pixels of the pre-activated neurons in each gray interval;
Step 6: set to 0 the pixel values corresponding to the group-activated and pre-activated neurons of the current gray interval, let k = k + 1, and repeat steps 3 to 6 until k = N.
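The descending gray intervals of step 2 can be sketched in Python as follows; the function name, the parameter values, and the tie to the exponentially decaying PCNN threshold of formula (10) are illustrative assumptions, not the patent's reference implementation.

```python
import numpy as np

def gray_intervals(n_intervals=8, t=1.0, alpha_T=0.3, T0=255.0):
    """Partition the 0-255 gray range into descending intervals that
    follow an exponentially decaying threshold T_k = T0*exp(-t*alpha_T*k),
    so each interval holds gray values that can fire together."""
    bounds = [T0 * np.exp(-t * alpha_T * k) for k in range(n_intervals + 1)]
    # interval k covers gray values in (bounds[k+1], bounds[k]]
    return [(bounds[k + 1], bounds[k]) for k in range(n_intervals)]

intervals = gray_intervals()
```

Iterating over `intervals` from the first (brightest) to the last reproduces the decreasing gray-value order required by step 2.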
Beneficial effects:
The present invention refines the PCNN neuron firing phenomenon and successfully separates group-activated neurons from the excited neurons that pre-activate. On the one hand, the activity rate of the pre-activated neurons characterizes the gray-value distribution of the target region; on the other hand, the pixel coordinates of the pre-activated neurons are collected for each gray interval, linking pixels with similar activation characteristics within the target region for joint analysis.
The invention exploits the distributed structure of the PCNN neighborhood to flexibly divide the target region into small pixel regions whose pixel-value distribution is characterized by the pre-activated neurons. Regions with similar activation properties are considered jointly, taking into account both the neighborhood gradient characteristics and the gray characteristics over a larger range, so the spatial dispersion of PCNN pre-activated neurons can be measured over a wide area; the target region is thus described from both a microscopic and a macroscopic point of view. Compared with other texture feature extraction methods, when the present method is used to extract frogman and fish target features for classification, the computational complexity is lower and the classification performance is better.
Brief description of the drawings
Fig. 1 shows the correspondence between PCNN neurons and the sonar image;
Fig. 2 is the gray image of the target region;
Fig. 3 is the segmented target image;
Fig. 4 shows the pixels activated in the first iteration, whose values lie in the highest gray interval;
Fig. 5 shows the pixel region affected by the activated neurons;
Fig. 6 shows the pixels in the affected region that undergo pre-activation;
Fig. 7 shows the suppressed region in which no further activation occurs;
Fig. 8 is a schematic diagram of an unactivated neuron Q being pre-activated under the influence of an activated neuron P in its neighborhood;
Fig. 9 is the statistical plot of group-activated neurons;
Fig. 10 is the statistical plot of the group activity rate;
Fig. 11 is the statistical plot of the group dispersion;
Fig. 12 is a sonar image of a fish target;
Fig. 13 is a sonar image of a frogman target;
Fig. 14 is the overall processing flowchart of the method.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
The present invention is implemented as follows:
1. Preprocess the original image, segment and calibrate the target region, and put the PCNN in correspondence with the sonar image: each central neuron corresponds to an image pixel, the neuron's input is the gray value of that pixel, and the neuron's neighborhood corresponds to the neighboring pixels;
2. Set the threshold-related parameters of the PCNN neurons and purposefully divide the 0-255 gray levels into N intervals;
3. Over the gray ranges divided in step 2, proceed interval by interval from relatively high gray values to relatively low ones, and judge whether the neurons in the current gray interval undergo group activation;
4. By the nature of PCNN neurons, an activated neuron can excite the unactivated neurons in its neighborhood, so the U values of the affected neurons must be recalculated;
5. Judge whether the affected unactivated neurons now meet the activation condition and fire (this is a form of pre-activation); if so, the neuron of that pixel has pre-activated, and the coordinates and number of the pre-activated pixels are recorded;
6. Set to 0 the pixel values of all group-activated neurons and of the excited neurons that pre-activated, so that they fire no more;
7. Return to step 3 and collect the group activation statistics of the next gray interval, recording pre-activation in the order of steps 3 to 6, until all intervals of the 0-255 gray range have been processed;
8. Count, for each gray interval, the number of excited neurons and the number of pre-activated neurons to obtain the group activity rate; count the pixel coordinates of the pre-activated neurons in each gray interval to obtain the group dispersion. Use the group activity rate and group dispersion as the feature values describing the region.
The implementation of the invention is now described in more detail, mainly with reference to Fig. 14:
The fundamental formulas of the PCNN are as follows:
F_ij[n] = F_ij[n-1]·exp(-t·α_F) + V_F·Σ_kl M_ijkl·Y_kl[n-1] + S_ij    (1)
L_ij[n] = L_ij[n-1]·exp(-t·α_L) + V_L·Σ_kl W_ijkl·Y_kl[n-1]    (2)
U_ij[n] = F_ij[n]·(1 + β·L_ij[n])    (3)
T_ij[n] = exp(-t·α_T)·T_ij[n-1] + V_T·Y_ij[n-1]    (4)
Y_ij[n] = step(U_ij[n] - T_ij[n-1]) = 1 if U_ij[n] ≥ T_ij[n-1], else 0    (5)
In the above formulas:
F_ij[n] is the primary (feeding) input of the neuron.
S_ij is the gray value of the image pixel corresponding to the central neuron.
L_ij[n] is the auxiliary (linking) input, carrying the contribution of the primary input's neighborhood.
U_ij[n] is the internal activity.
T_ij[n] is the dynamic threshold.
M_ijkl and W_ijkl are the synaptic link weight matrices of the neuron, representing its receptive field.
Y_kl[n-1] is the pulse output of the pulse-coupled neural network, taking the value 1 (activated) or 0 (not activated).
ij are the row and column coordinates of the central pixel of the primary input region; kl are the coordinates of the auxiliary neighborhood pixels.
V_F, V_L and V_T are neighborhood influence parameters, and β is the linking strength between the primary input channel F and the auxiliary input channel L.
n = 1, 2, ..., a multiple of the minimum sampling period, indicates which dynamic step the formulas are in; it is referred to below as the iteration number.
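As a sketch, formulas (1)-(5) can be written out as a single NumPy iteration step over whole arrays; the parameter values, the 3x3 neighborhood helper standing in for the link matrices M and W, and all function names are assumptions for illustration, not the patent's reference code.

```python
import numpy as np

def nsum(Y, K):
    """Weighted 3x3 neighborhood sum, standing in for sum_kl M_ijkl*Y_kl."""
    P = np.pad(Y, 1)  # zero-pad so edge pixels see a truncated neighborhood
    H, Wd = Y.shape
    return sum(K[di, dj] * P[di:di + H, dj:dj + Wd]
               for di in range(3) for dj in range(3))

def pcnn_step(S, F, L, T, Y, M, W, t=1.0, aF=0.1, aL=0.3, aT=0.2,
              VF=0.5, VL=0.2, VT=20.0, beta=0.1):
    """One iteration of formulas (1)-(5); S_ij enters (1) as the constant
    bias that the simplification to formula (6) later drops."""
    F = F * np.exp(-t * aF) + VF * nsum(Y, M) + S      # formula (1)
    L = L * np.exp(-t * aL) + VL * nsum(Y, W)          # formula (2)
    U = F * (1.0 + beta * L)                           # formula (3)
    Y_new = (U >= T).astype(float)                     # formula (5), vs. T[n-1]
    T = np.exp(-t * aT) * T + VT * Y                   # formula (4), uses Y[n-1]
    return F, L, U, T, Y_new
```

Note that the output Y_new is computed against the previous threshold before formula (4) decays it, matching the [n-1] indices in the formulas above.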
The implementation process is described in detail below with reference to the accompanying drawings:
1. As shown in Fig. 1, the PCNN is put in correspondence with the sonar image: each central neuron corresponds to an image pixel, the neighborhood neurons correspond to the neighboring pixels, and the neuron's input is the gray value of the pixel. The original image is preprocessed, mainly for smoothing and denoising, and the target region is segmented, giving Fig. 2; the PCNN then corresponds to the processed image. The target contour may be segmented and the gray value of the background set to zero to eliminate the influence of background noise, giving Fig. 3.
2. Simplifying the PCNN fundamental formulas:
When iteration starts, F_ij[n-1] equals the gray value corresponding to the central PCNN neuron, i.e. F_ij[n-1] = S_ij, and all neurons are suppressed, i.e. Y_ij[n-1] = 0, so the activation behavior of the central neuron is governed by S_ij and the decay parameter α_F. Dropping the constant bias S_ij in formula (1) gives:
F*_ij[n] = S_ij·exp(-t·α_F)    (6)
When first judging whether an unactivated neuron pre-activates, only the influence of already-activated neurons on the central neuron need be considered, so formula (2) simplifies to:
L_ij[n] = V_L·Σ_kl W_ijkl·Y_kl[n-1]    (7)
The influence of the PCNN neighborhood on the central neuron is a dynamic process: after the initial activation of the neighborhood, the change of Y_kl[n-1] can significantly affect the surrounding neurons and cause them to pre-activate. The present invention therefore splits formula (3) into two parts:
Part I: U*_ij[n] = F*_ij[n]    (8)
Part II: U_ij[n] = U*_ij[n] + β·F*_ij[n]·L_ij[n]    (9)
When first judging whether group activation occurs, all neurons are suppressed, i.e. Y_ij[n-1] = 0, so formula (4) simplifies to:
T_ij[n] = exp(-t·α_T)·T_0    (10)
where T_0 is the initial threshold.
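A sketch of the pre-activation test built from the simplified formulas (6)-(9): recompute U only for not-yet-fired neurons whose neighborhood contains fired neurons, and flag those whose new internal activity falls in the current gray interval. The 3x3 uniform neighborhood, the half-open interval convention, and the parameter values are assumptions.

```python
import numpy as np

def preactivated(S, fired, lo, hi, t=1.0, aF=0.1, VL=0.2, beta=0.1):
    """Flag unfired neurons whose internal activity U, after the
    neighborhood term of formula (7) is added via formulas (8)-(9),
    lands inside the current gray interval (lo, hi]."""
    W = np.ones((3, 3))
    P = np.pad(fired.astype(float), 1)
    H, Wd = fired.shape
    neigh = sum(W[di, dj] * P[di:di + H, dj:dj + Wd]
                for di in range(3) for dj in range(3))
    Fstar = S * np.exp(-t * aF)              # formula (6)
    Lterm = VL * neigh                       # formula (7)
    U = Fstar + beta * Fstar * Lterm         # formulas (8) and (9)
    return (~fired.astype(bool)) & (neigh > 0) & (U > lo) & (U <= hi)
```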
With the parameters of formulas (9) and (10) set, neurons with close gray values can activate simultaneously at the same sampling instant according to formula (5); the 0-255 gray levels are thus divided into several intervals, each containing the neurons that can activate simultaneously.
3. Over the gray intervals divided in step 2, iterate in order of decreasing gray value, one interval per iteration, and judge the activation of the PCNN neurons corresponding to the sonar image pixels according to formula (5). Fig. 4 shows the first iteration, i.e. the pixels whose neurons undergo group activation in the highest gray interval;
4. By the firing behavior of PCNN neurons, an activated neuron can excite the unactivated neurons in its neighborhood, as shown in Fig. 8; formula (9) is therefore used to recompute the U_ij[n] values of the affected neurons. Fig. 5 shows the pixels of the excited neurons;
5. Judge whether the affected unactivated neurons now meet the activation condition and fire (this is a form of pre-activation); if so, the neuron of that pixel has pre-activated, and the coordinates and number of the pre-activated pixels are recorded. Fig. 6 shows the pixels of the pre-activated neurons;
6. Set to 0 the pixel values of all group-activated neurons and of the excited neurons that pre-activated, so that they fire no more, as shown in Fig. 7;
7. Return to step 3 for the next iteration and, following the order of steps 3 to 6, record for each subsequent gray interval the group-activated, excited, and pre-activated neurons, until all intervals of the 0-255 gray range have been processed, yielding the time pulse sequence plot of Fig. 9, in which the abscissa is the iteration number and the ordinate is the number of group-activated neurons in the gray interval of each iteration;
8. Counting the number λ of excited neurons and the number λ' of those neurons that successfully pre-activate yields the group activity rate of activated neurons on their neighborhoods (hereinafter "group activity rate"), defined by formula (11). Collecting the group activity rate for each gray interval gives the group activity rate vector shown in Fig. 10, where the abscissa is the iteration number and the ordinate is the group activity rate of the gray interval at each iteration.
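The image of formula (11) is not reproduced in this text. Given that λ counts the excited neurons and λ' those that successfully pre-activate, one plausible reading, offered here only as an assumption, is the ratio λ'/λ:

```python
def group_activity_rate(lam, lam_prime):
    """Hypothetical reading of formula (11): the fraction of excited
    neurons in the neighborhoods of group-activated neurons that
    actually pre-activate in the current gray interval."""
    return lam_prime / lam if lam else 0.0

# one value per gray interval yields the vector plotted in Fig. 10
rates = [group_activity_rate(lam, lp) for lam, lp in [(10, 4), (8, 8), (0, 0)]]
```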
In addition, the present invention proposes the concept of PCNN "activation group dispersion" (hereinafter "group dispersion") to describe the spatial dispersion of the distribution of pre-activated neurons. The motivation is that the group activity rate characterizes the excitability and texture properties of the neighborhood but does not reflect the spatial distribution of pre-activated neurons over a wider range; "group dispersion" is introduced to capture this feature, and formula (12) is its definition, where x and y are the coordinates of the pre-activated pixels in each gray interval, with centroid:
x_0 = (1/λ')·Σ_{i=1..λ'} x_i,   y_0 = (1/λ')·Σ_{i=1..λ'} y_i    (13)
Collecting the group dispersion for each gray interval gives the group dispersion vector shown in Fig. 11, where the abscissa is the iteration number and the ordinate is the group dispersion of the gray interval at each iteration. Group dispersion compensates for the neighborhood-limited nature of the activity rate and reflects the spatial distribution of activated neurons over the whole target area.
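The image of formula (12) is likewise not reproduced. A plausible definition consistent with the centroid (x_0, y_0) of formula (13), namely the mean Euclidean distance of the pre-activated pixels from their centroid, can be sketched as follows; this is an assumption, not the patent's exact formula:

```python
import numpy as np

def group_dispersion(xs, ys):
    """Assumed form of formula (12): mean distance of the pre-activated
    pixels from the centroid (x0, y0) given by formula (13)."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    x0, y0 = xs.mean(), ys.mean()            # formula (13)
    return float(np.mean(np.hypot(xs - x0, ys - y0)))
```

A tight cluster of pre-activated pixels gives a small value; the same number of pixels scattered over the target gives a large one, which is the behavior the patent ascribes to group dispersion.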
Using the present method, the sonar images shown in Fig. 12 and Fig. 13 (and other images) can be classified, as follows: the group activity rate vectors of the two targets to be classified are matched (group dispersion vectors are matched in the same way), with the matching degree defined as:
η = (1/N)·Σ_{i=1..N} 1/(x_1i - x_2i)
where x_1i and x_2i are the values at corresponding positions of the two targets' feature vectors, and N is the number of gray intervals.
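The matching degree can be sketched directly. The absolute value on the difference is an assumption added so that the measure stays positive, since the extracted text shows only 1/(x_1i - x_2i):

```python
import numpy as np

def matching_degree(v1, v2):
    """Matching degree eta = (1/N) * sum_i 1/|x1_i - x2_i| between the
    feature vectors (activity-rate or dispersion) of two targets.
    Identical components would make a term infinite, so a real
    implementation would need a small epsilon; none is used here."""
    d = np.abs(np.asarray(v1, dtype=float) - np.asarray(v2, dtype=float))
    return float(np.mean(1.0 / d))
```

Closer feature vectors give a larger matching degree, which matches the reported order-of-magnitude gap between same-class and different-class targets.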
Experiments show that the matching degree between targets of the same class is more than an order of magnitude higher than that between targets of different classes, so the method has good classification performance.

Claims (1)

1. A region feature extraction method based on PCNN neuron activation rate and group dispersion, characterized by comprising the following steps:
Step 1: acquire the original image, preprocess it, and segment the contour of the target region; put the PCNN in correspondence with the image, with each central neuron corresponding to an image pixel and the neuron's neighborhood corresponding to the neighboring pixels, the input of each neuron being the gray value of its pixel;
Step 2: by setting the threshold parameters of the PCNN neurons, divide the gray range 0-255 into N gray intervals according to the target region, arranged in order of decreasing gray value;
Step 3: read the current, k-th, gray interval and obtain the neurons undergoing group activation in it, i.e. the neurons whose pixel values lie within the gray range of the current interval;
Step 4: count the number λ of excited neurons in the neighborhoods of the group-activated neurons and recompute the internal activity of these excited neurons; if the internal activity of an excited neuron lies within the gray range of the current interval, that excited neuron is a pre-activated neuron, and the coordinates of its pixel are recorded;
Step 5: count the number λ' of pre-activated neurons and obtain the group activity rate and the group dispersion, where x and y denote the coordinates of the pixels of the pre-activated neurons in each gray interval;
Step 6: set to 0 the pixel values corresponding to the group-activated and pre-activated neurons of the current gray interval, let k = k + 1, and repeat steps 3 to 6 until k = N.
CN201510056549.7A 2015-02-04 2015-02-04 Region feature extraction method based on PCNN neuron activation rate and group dispersion Active CN104636753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510056549.7A CN104636753B (en) 2015-02-04 2015-02-04 Region feature extraction method based on PCNN neuron activation rate and group dispersion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510056549.7A CN104636753B (en) 2015-02-04 2015-02-04 Region feature extraction method based on PCNN neuron activation rate and group dispersion

Publications (2)

Publication Number Publication Date
CN104636753A true CN104636753A (en) 2015-05-20
CN104636753B CN104636753B (en) 2017-11-21

Family

ID=53215481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510056549.7A Active CN104636753B (en) 2015-02-04 2015-02-04 Region feature extraction method based on PCNN neuron activation rate and group dispersion

Country Status (1)

Country Link
CN (1) CN104636753B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108053391A (en) * 2017-11-22 2018-05-18 华中科技大学 A kind of method for identifying neuron reconstruction errors
CN108604203A (en) * 2016-02-24 2018-09-28 索尼公司 Signal processing apparatus, signal processing method and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840568A (en) * 2009-09-29 2010-09-22 天津大学 Neigh Shrink image denoising method based on PCNN (Pulse Coupled Neural Network) region segmentation
CN102306289A (en) * 2011-09-16 2012-01-04 兰州大学 Method for extracting iris features based on pulse couple neural network (PCNN)
CN103605811A (en) * 2013-12-10 2014-02-26 三峡大学 Texture image retrieval method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840568A (en) * 2009-09-29 2010-09-22 天津大学 Neigh Shrink image denoising method based on PCNN (Pulse Coupled Neural Network) region segmentation
CN102306289A (en) * 2011-09-16 2012-01-04 兰州大学 Method for extracting iris features based on pulse couple neural network (PCNN)
CN103605811A (en) * 2013-12-10 2014-02-26 三峡大学 Texture image retrieval method and device

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Zhaobin Wang: "Review of pulse-coupled neural networks", Image and Vision Computing *
Hou Yang (侯扬): "Research on a class of feature extraction and dimension reduction methods for strip steel surface defect images", China Master's Theses Full-text Database, Information Science and Technology Series *
Liu Qing (刘勍): "Research on several problems of image processing based on pulse-coupled neural networks", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *
Ning Xiaoju (宁晓菊): "An image retrieval method based on neural-network image feature extraction", Journal of Xi'an University of Posts and Telecommunications *
Li Shuangke (李双科): "A PCNN-based fingerprint feature extraction algorithm", Wanfang Data *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108604203A (en) * 2016-02-24 2018-09-28 索尼公司 Signal processing apparatus, signal processing method and program
CN108604203B (en) * 2016-02-24 2022-03-18 索尼公司 Signal processing apparatus, signal processing method, and storage medium
CN108053391A (en) * 2017-11-22 2018-05-18 华中科技大学 A kind of method for identifying neuron reconstruction errors
CN108053391B (en) * 2017-11-22 2020-06-23 华中科技大学 Method for identifying neuron reconstruction errors

Also Published As

Publication number Publication date
CN104636753B (en) 2017-11-21

Similar Documents

Publication Publication Date Title
CN107103338B (en) SAR target recognition method integrating convolution features and integrated ultralimit learning machine
CN108711141B (en) Motion blurred image blind restoration method using improved generation type countermeasure network
CN103778432B (en) Human being and vehicle classification method based on deep belief net
CN111523579B (en) Vehicle type recognition method and system based on improved deep learning
CN106169081A (en) A kind of image classification based on different illumination and processing method
CN105913081B (en) SAR image classification method based on improved PCAnet
CN112396587B (en) Method for detecting congestion degree in bus compartment based on collaborative training and density map
CN107248180B (en) fMRI natural image decoding method based on hidden state model
CN106096660A (en) Convolutional neural networks based on independent composition analysis algorithm
CN104636753A (en) Region characteristic extraction method based on PCNN (Pulse Coupled Neural Network) neuron activation rate and cluster dispersion
Zhou et al. Robust temporal smoothness in multi-task learning
CN107045624A (en) Electroencephalogram signal preprocessing and classifying method based on maximum weighted cluster
Mermillod et al. The coarse-to-fine hypothesis revisited: Evidence from neuro-computational modeling
CN110569727B (en) Transfer learning method combining intra-class distance and inter-class distance for motor imagery classification
Sang et al. Image quality assessment based on quaternion singular value decomposition
CN108364007A (en) A kind of brain function connection features extracting method based on brain structure connection constraints
CN109961085B (en) Method and device for establishing flight delay prediction model based on Bayesian estimation
Du et al. Encoding and decoding target locations with waves in the turtle visual cortex
Fan et al. BFNet: Brain-like feedback network for object detection under severe weather
Li et al. An identification method of dangerous driving behavior in rush hour based on apriori algorithm.
CN115308705A (en) Multi-pose extremely narrow pulse echo generation method based on generation countermeasure network
CN112836669B (en) Driver distraction driving detection method
Xiang et al. An improved multiple imputation method based on chained equations for distributed photovoltaic systems
Dong et al. Clustering human wrist pulse signals via multiple criteria decision making
Hinrich et al. Variational Bayesian partially observed non-negative tensor factorization

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant