KR960012131A - Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks - Google Patents
Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks Download PDFInfo
- Publication number
- KR960012131A (application KR1019940025170A)
- Authority
- KR
- South Korea
- Prior art keywords
- error signal
- generation method
- signal generation
- output
- neural networks
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Computational Linguistics (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Neurology (AREA)
- Image Analysis (AREA)
- Feedback Control In General (AREA)
Abstract
To improve the performance of the back-propagation learning algorithm for multilayer perceptrons, a new error function is proposed. The proposed error function generates a larger error signal as the difference between the target value and the output value of the output layer grows, which reduces incorrect saturation of output nodes during training. Conversely, as the output value approaches the target value, the error signal becomes small, preventing the neural network from overfitting the training patterns.
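The saturation problem the abstract refers to can be illustrated with a small sketch. This is not the patented error function itself (the full text is not published here); it only contrasts the standard squared-error signal, which vanishes when a sigmoid output saturates at the wrong extreme, with a cross-entropy-style signal in which the sigmoid derivative cancels, so the error signal grows with the target-output difference, as the abstract describes.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def delta_mse(target, output):
    # Squared-error signal at an output node: (t - y) * f'(net).
    # For a sigmoid, f'(net) = y * (1 - y), which approaches 0 as the
    # output saturates near 0 or 1 -- even when the output is wrong.
    return (target - output) * output * (1.0 - output)

def delta_ce(target, output):
    # Cross-entropy-style signal: the sigmoid derivative cancels, so the
    # error signal stays proportional to the target-output difference.
    return target - output

# A wrongly saturated output node: target is 0 but the output is near 1.
t, y = 0.0, 0.99
print(abs(delta_mse(t, y)))  # tiny: learning stalls
print(abs(delta_ce(t, y)))   # large: strong correction
```

Under this sketch, the squared-error signal at the saturated node is roughly a hundredth of the cross-entropy-style signal, which is the stalling behavior the proposed error function is designed to avoid.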
Description
As only the essential parts of this application are published, the full text is not included here.
Figure 1 shows the structure of the multilayer perceptron neural network.
Figure 4 shows the proposed error signal for efficient learning.
Claims (1)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1019940025170A KR0141341B1 (en) | 1994-09-30 | 1994-09-30 | Error signal generation method for efficient learning of multilayer perceptron neural network |
JP6313861A JP2607351B2 (en) | 1994-09-30 | 1994-12-16 | Error Signal Generation Method for Efficient Learning of Multilayer Perceptron Neural Network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1019940025170A KR0141341B1 (en) | 1994-09-30 | 1994-09-30 | Error signal generation method for efficient learning of multilayer perceptron neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
KR960012131A true KR960012131A (en) | 1996-04-20 |
KR0141341B1 KR0141341B1 (en) | 1998-07-01 |
Family
ID=19394262
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1019940025170A KR0141341B1 (en) | 1994-09-30 | 1994-09-30 | Error signal generation method for efficient learning of multilayer perceptron neural network |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2607351B2 (en) |
KR (1) | KR0141341B1 (en) |
-
1994
- 1994-09-30 KR KR1019940025170A patent/KR0141341B1/en not_active IP Right Cessation
- 1994-12-16 JP JP6313861A patent/JP2607351B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JP2607351B2 (en) | 1997-05-07 |
JPH08115310A (en) | 1996-05-07 |
KR0141341B1 (en) | 1998-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hagiwara | Novel backpropagation algorithm for reduction of hidden units and acceleration of convergence using artificial selection | |
Fahlman | The recurrent cascade-correlation architecture | |
Nauck et al. | Designing neuro-fuzzy systems through backpropagation | |
Suzuki et al. | Intelligent agent system for human-robot interaction through artificial emotion | |
Lin et al. | Adaptive fuzzy command acquisition with reinforcement learning | |
Touretzky | Connectionism and compositional semantics | |
Fletcher et al. | Combining prior symbolic knowledge and constructive neural network learning | |
Reyneri | Weighted radial basis functions for improved pattern recognition and signal processing | |
Shastri | Types and quantifiers in SHRUTI–a connectionist model of rapid reasoning and relational processing | |
Yu et al. | An adaptive activation function for multilayer feedforward neural networks | |
KR960012131A (en) | Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks | |
Dyer | Lexical acquisition through symbol recirculation in distributed connectionist networks | |
Kasabov et al. | Hybrid intelligent adaptive systems: A framework and a case study on speech recognition | |
Peng et al. | Adaptive nonmonotone conjugate gradient training algorithm for recurrent neural networks | |
Hattori et al. | Quick learning for multidirectional associative memories | |
Erenshteyn et al. | Is designing a neural network application an art or a science? | |
Lara | Artificial neural networks: An introduction | |
Stenning | A Speech Based Connectionist Model of Human Short Term Memory | |
Rueckl | Connectionism and the Notion of Levels | |
Omori et al. | Incremental knowledge acquisition architecture that is driven by the emergence of inquiry conversation | |
Becerra et al. | Self pruning Gaussian Synapse Networks for behavior based robots | |
Rossi | Connectionism and the emergence of knowledge | |
Tadj et al. | On a Fuzzy DVQ Algorithm for Speech Recognition | |
Wang et al. | Research and design of a fuzzy neural expert system | |
Jou et al. | Parallel distributed processing with multiple one-output back-propagation neural networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment |
Payment date: 20080307 Year of fee payment: 11 |
LAPS | Lapse due to unpaid annual fee |