KR960012131A - Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks - Google Patents

Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks Download PDF

Info

Publication number
KR960012131A
KR960012131A (application KR1019940025170A)
Authority
KR
South Korea
Prior art keywords
error signal
generation method
signal generation
output
neural networks
Prior art date
Application number
KR1019940025170A
Other languages
Korean (ko)
Other versions
KR0141341B1 (en)
Inventor
오상훈
Original Assignee
양승택
재단법인 한국전자통신연구소 (Electronics and Telecommunications Research Institute)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 양승택 and 재단법인 한국전자통신연구소
Priority to KR1019940025170A (granted as KR0141341B1)
Priority to JP6313861A (granted as JP2607351B2)
Publication of KR960012131A
Application granted
Publication of KR0141341B1

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G06N3/084: Backpropagation, e.g. using gradient descent
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Neurology (AREA)
  • Image Analysis (AREA)
  • Feedback Control In General (AREA)

Abstract

To improve the performance of the backpropagation learning algorithm for multilayer perceptrons, a new error function is proposed. The proposed error function generates a larger error signal the further the target value of the output layer is from the actual output value, which reduces improper saturation of the output nodes during training. Conversely, when the target value of the output layer is close to the output value, only a small error signal is generated, which prevents the neural network from overfitting the training patterns.
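Because only the essential parts of the application are published here, the exact form of the proposed error function is not available in this record. The behavior the abstract describes can, however, be sketched: with a sigmoid output, the conventional MSE error signal contains a derivative factor that vanishes when a node saturates, while a signal that grows with the target-to-output difference stays large for a wrongly saturated node and shrinks as the output nears the target. The `(t - y)` form below is a common cross-entropy-style stand-in, used for illustration only, not the patented function.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def delta_mse(t, y):
    # Conventional backpropagation error signal for MSE:
    # the derivative factor y*(1 - y) vanishes as y saturates
    # near 0 or 1, so a wrongly saturated node learns very slowly.
    return (t - y) * y * (1.0 - y)

def delta_strong(t, y):
    # Illustrative error signal with the behavior the abstract describes:
    # large when the output is far from the target (even if saturated),
    # small as the output approaches the target. This cross-entropy-style
    # form is an assumption, not necessarily the patented function.
    return t - y

# An "improperly saturated" output node: target 0, output near 1.
y_bad = sigmoid(6.0)             # ~0.9975
print(delta_mse(0.0, y_bad))     # near zero: learning stalls
print(delta_strong(0.0, y_bad))  # near -1: strong corrective signal

# A node already close to its target: both signals are small,
# which limits overtraining on the learning pattern.
y_good = sigmoid(-6.0)           # ~0.0025
print(delta_strong(0.0, y_good))
```

Note how both signals agree when the output is near the target; they differ only in the saturated-but-wrong regime, which is exactly the case the abstract targets.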

Description

Error Signal Generation Method for Efficient Learning of Multilayer Perceptron Neural Networks

Since this application is published with only its essential parts disclosed, the full text is not included.

Figure 1 shows the structure of the multilayer perceptron neural network;

Figure 4 is a diagram of the proposed error signal for efficient learning.

Claims (1)

In a multilayer perceptron, a neural network model that mimics the information processing of living organisms and consists of hierarchically arranged nodes representing neurons and synaptic weights connecting those nodes, a method of generating an error signal, characterized in that, during backpropagation learning of the multilayer perceptron, a stronger error signal is generated the more improperly an output node is saturated, and a weaker error signal is generated the more appropriately the output node is saturated.

※ Note: Published based on the contents of the initial application.
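The claim covers generating the error signal at the output layer of a multilayer perceptron during backpropagation learning. A minimal sketch of how such a signal drives the weight updates in a tiny 2-2-1 network follows; the `(t - y)` output signal is again an assumed, illustrative form, and the hidden-layer propagation is standard backpropagation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w1, w2, x):
    # Hidden layer then output layer of a small 2-2-1 perceptron.
    h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w1]
    y = sigmoid(sum(wi * hi for wi, hi in zip(w2, h)))
    return h, y

def backprop_step(w1, w2, x, t, lr=0.5):
    h, y = forward(w1, w2, x)
    # Output error signal: a saturation-robust (t - y) form stands in
    # for the patented generation method (assumed, for illustration).
    delta_out = t - y
    # Propagate the signal back to the hidden layer through the weights.
    delta_h = [delta_out * w2[j] * h[j] * (1.0 - h[j]) for j in range(len(h))]
    # Apply the weight updates driven by the error signals.
    w2 = [w2[j] + lr * delta_out * h[j] for j in range(len(w2))]
    w1 = [[w1[j][i] + lr * delta_h[j] * x[i] for i in range(len(x))]
          for j in range(len(w1))]
    return w1, w2

w1 = [[0.3, -0.2], [0.1, 0.4]]   # hidden weights (2 units x 2 inputs)
w2 = [0.5, -0.3]                 # output weights
x, t = [1.0, 0.0], 1.0
_, y0 = forward(w1, w2, x)
w1, w2 = backprop_step(w1, w2, x, t)
_, y1 = forward(w1, w2, x)
print(y0, y1)  # the output moves toward the target after one step
```

Any error-signal function with the claimed property (strong when improperly saturated, weak when close to the target) can be dropped in as `delta_out` without changing the rest of the update.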
KR1019940025170A 1994-09-30 1994-09-30 Error signal generation method for efficient learning of multilayer perceptron neural network KR0141341B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1019940025170A KR0141341B1 (en) 1994-09-30 1994-09-30 Error signal generation method for efficient learning of multilayer perceptron neural network
JP6313861A JP2607351B2 (en) 1994-09-30 1994-12-16 Error Signal Generation Method for Efficient Learning of Multilayer Perceptron Neural Network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1019940025170A KR0141341B1 (en) 1994-09-30 1994-09-30 Error signal generation method for efficient learning of multilayer perceptron neural network

Publications (2)

Publication Number Publication Date
KR960012131A true KR960012131A (en) 1996-04-20
KR0141341B1 KR0141341B1 (en) 1998-07-01

Family

ID=19394262

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1019940025170A KR0141341B1 (en) 1994-09-30 1994-09-30 Error signal generation method for efficient learning of multilayer perceptron neural network

Country Status (2)

Country Link
JP (1) JP2607351B2 (en)
KR (1) KR0141341B1 (en)

Also Published As

Publication number Publication date
JP2607351B2 (en) 1997-05-07
JPH08115310A (en) 1996-05-07
KR0141341B1 (en) 1998-07-01

Similar Documents

Publication Publication Date Title
Hagiwara Novel backpropagation algorithm for reduction of hidden units and acceleration of convergence using artificial selection
Fahlman The recurrent cascade-correlation architecture
Nauck et al. Designing neuro-fuzzy systems through backpropagation
Suzuki et al. Intelligent agent system for human-robot interaction through artificial emotion
Lin et al. Adaptive fuzzy command acquisition with reinforcement learning
Touretzky Connectionism and compositional semantics
Fletcher et al. Combining prior symbolic knowledge and constructive neural network learning
Reyneri Weighted radial basis functions for improved pattern recognition and signal processing
Shastri Types and quantifiers in SHRUTI–a connectionist model of rapid reasoning and relational processing
Yu et al. An adaptive activation function for multilayer feedforward neural networks
KR960012131A (en) Error Signal Generation Method for Efficient Learning of Multi-layer Perceptron Neural Networks
Dyer Lexical acquisition through symbol recirculation in distributed connectionist networks
Kasabov et al. Hybrid intelligent adaptive systems: A framework and a case study on speech recognition
Peng et al. Adaptive nonmonotone conjugate gradient training algorithm for recurrent neural networks
Hattori et al. Quick learning for multidirectional associative memories
Erenshteyn et al. Is designing a neural network application an art or a science?
Lara Artificial neural networks: An introduction
Stenning A Speech Based Connectionist Model of Human Short Term Memory
Rueckl Connectionism and the Notion of Levels
Omori et al. Incremental knowledge acquisition architecture that is driven by the emergence of inquiry conversation
Becerra et al. Self pruning Gaussian Synapse Networks for behavior based robots
Rossi Connectionism and the emergence of knowledge
Tadj et al. On a Fuzzy DVQ Algorithm for Speech Recognition
Wang et al. Research and design of a fuzzy neural expert system
Jou et al. Parallel distributed processing with multiple one-output back-propagation neural networks

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20080307

Year of fee payment: 11

LAPS Lapse due to unpaid annual fee