WO2022152364A1 - Method and analysis system for the classification of parameters - Google Patents
Method and analysis system for the classification of parameters
- Publication number
- WO2022152364A1 (application PCT/EP2021/050465, EP2021050465W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- output
- input
- neurons
- neuron
- synapses
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
Definitions
- the present invention relates to a method for the machine-aided optimization of the classification of a number of parameters exhibiting parameter values into classes, to which the parameters are linked by coupling coefficients.
- an artificial neuronal network (ANN) is used for the classification of the parameters.
- the ANN has an input layer with a plurality of input neurons, the number thereof being selected according to the number of parameters as well as an output layer with output neurons, the number thereof being selected according to the number of classes.
- the input and output neurons are connected via synapses allowing a signal transfer from the input neurons to the output neurons, whereby each of the synapses includes an adjustable transmission function.
- the transmission function thereby corresponds to the coupling coefficient between a parameter value (input neuron) and a corresponding class (output neuron).
- the transmission function can vary between an inhibitory and excitatory effect on the activity of the connected output neurons and it comprises a value specifying the amount of the inhibitory or excitatory effect.
- the activity of each input and output neuron is indicated by the rate of its action potentials (AP) which again defines a pulse signal with a frequency according to the activity of the neuron.
- An inhibitory effect of the transmission function leads to a reduction of the frequency of the pulse signal whereas an excitatory transmission function leads to an increase of the frequency. If for example normalized coupling coefficients are used, then a coupling coefficient of -1 would be fully inhibitory, reducing the frequency of the connected output neuron by a maximum amount.
- the value 0.0 would be neutral, meaning that it has no effect on the frequency of the output neuron, whereas the value +1.0 would have a maximum excitatory effect, meaning that the frequency of the pulse signal of the output neuron is increased by a maximum value.
- the activity of the output neuron is dependent on the nonlinear sum of the transmission functions of all incoming synapses, e.g. according to the Goldman equation.
- the rate of the APs of the input neurons is set according to the parameter values obtained from a data input whereas the rate of the APs of the output neurons depends as mentioned above on the sum of the signals obtained from all incoming synapses which signals are again dependent on the transmission function of the corresponding synapses, i.e. which signals are highly non-linear and time-dependent.
- the signals obtained from the synapses via the transmission function corresponds to a pulse train with a frequency corresponding to the action potential of the input neuron affected by the amount of inhibitory or excitatory effect of the transmission function.
- the output neuron has an activation function which sums up the frequency and the excitatory or inhibitory effect of each incoming signal and thus generates a pulse signal with a frequency depending on this sum of the incoming signals from all synapses.
- the activation function could e.g. correspond to the Goldman function, which is used to calculate the membrane potential of a biological neuron.
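The summation of synaptic signals into an output firing rate can be sketched in Python; the logistic squashing function below is a placeholder for the Goldman equation named above (which the text does not reproduce), and the 1/max_rate scaling of the drive is an assumption for illustration.

```python
import math

def output_rate(input_rates, couplings, max_rate=500.0):
    """Non-linear sum of synaptic signals -> output firing rate (Hz).

    input_rates: firing rates of the connected input neurons in Hz.
    couplings:   coupling coefficients in [-1, 1]; negative values are
                 inhibitory, positive values excitatory, 0 is neutral.
    """
    # each synapse contributes its input rate scaled by its coupling
    drive = sum(r * c for r, c in zip(input_rates, couplings))
    # squash the summed drive non-linearly into [0, 1]
    activation = 1.0 / (1.0 + math.exp(-drive / max_rate))
    return activation * max_rate
```

With all couplings neutral the output sits at half the maximum rate; excitatory couplings raise the frequency and inhibitory ones lower it, as the preceding bullets describe.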
- an initialization step is performed wherein an initialization database containing the correlation coefficients between the input and output neurons - i.e. between the parameters and the classes - is used, whereby in this initialization step the transmission functions or coupling coefficients of all synapses are computed (one-shot learning) according to the correlation coefficients from the initialization database.
- After initialization the artificial neuronal network is run in a simulation step, whereby the rate of the APs of each output neuron adjusts according to the sum of signals of the transmission functions of its incoming synapses.
- via its activation function each output neuron generates its own action potentials, i.e. the frequency of its generated pulse signal.
- the simulation step is e.g. terminated after a set time.
- the set time could be for example 1 s or 10 s.
- the simulation step is terminated after an abortion criterion is fulfilled.
- Abortion criteria for simulations are per se known in the art. Such abortion criteria may for example comprise the distance of the AP rate of the most active output neuron from the AP rate of the second most active output neuron.
- Other abortion criteria are the absolute value of the AP rate of the most active output neuron and/or the convergence of the activity (AP rates) of the output neurons. Of course, these different abortion criteria can be applied solely or in combination.
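The abortion criteria named above can be combined as in the following Python sketch; the threshold values mirror the examples given later in the description (350 Hz absolute rate, 150 Hz gap between the two most active output neurons), while the convergence tolerance `eps` is an assumption not taken from the text.

```python
def should_abort(rates, prev_rates, min_gap=150.0, min_top=350.0, eps=1.0):
    """Return True if any of the abortion criteria is fulfilled.

    rates / prev_rates: AP rates (Hz) of the output neurons in the
    current and previous processing cycle.
    """
    top, second = sorted(rates, reverse=True)[:2]
    gap_ok = (top - second) >= min_gap      # distance criterion
    abs_ok = top >= min_top                 # absolute-rate criterion
    # convergence criterion: no output rate changed by more than eps
    converged = all(abs(a - b) < eps for a, b in zip(rates, prev_rates))
    return gap_ok or abs_ok or converged
```

Any single criterion suffices to stop the simulation, matching the "solely or in combination" wording above.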
- the operation parameters of a car, e.g. motor speed, car speed, spoiler angle, fuel consumption, motor temperature, brake disc temperature and tire temperature, can be fed on-time to the analysis system, and the classes to be monitored may be different hazardous situation classes, e.g. piston jamming, motor overheating, problems with the turbo engine etc. If the value of such a failure class becomes too high, steps can be undertaken immediately.
- normalized values are used for the parameter values and/or for the class values, so that parameter values as well as class values range between 0 and 1. This is the easiest way to handle and evaluate the parameter values, the coupling coefficients as well as the class values.
- the normalized values are directly transferred to a given frequency range. For example, if the minimum frequency of the activity potential of a neuron is 0 Hz and the maximum frequency is 500 Hz, then a parameter value or class value of 0.5 would correspond to a frequency of 250 Hz. Of course, these values are only exemplary.
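The mapping between normalized values and firing frequencies described above can be written as a pair of hypothetical helper functions (the 500 Hz maximum is the exemplary value from the text):

```python
def value_to_rate(value, max_rate=500.0):
    """Map a normalized parameter or class value in [0, 1] to a firing
    rate in [0, max_rate] Hz, as in the 0.5 -> 250 Hz example."""
    if not 0.0 <= value <= 1.0:
        raise ValueError("expected a normalized value in [0, 1]")
    return value * max_rate

def rate_to_value(rate, max_rate=500.0):
    """Inverse mapping: a firing rate back to a normalized value."""
    return rate / max_rate
```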
- For the coupling coefficients, values between -1.0 and 1.0 can also be taken, whereby -1.0 is maximally inhibitory, 1.0 is maximally excitatory and 0 is neutral, which correspond to the values 0, 0.5 and 1.0 of the correlation coefficients from the initial database. These values are used for the computation of the coupling coefficients in the network (e.g. one-shot learning).
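The stated correspondence (correlation coefficients 0, 0.5 and 1.0 mapping to coupling coefficients -1.0, 0 and +1.0) is a simple linear rescaling, which can be sketched for the one-shot initialization as:

```python
def correlation_to_coupling(r):
    """One-shot initialization: map a correlation coefficient in [0, 1]
    from the initialization database to a coupling coefficient in
    [-1, 1], so that 0 -> -1 (maximally inhibitory), 0.5 -> 0 (neutral)
    and 1.0 -> +1 (maximally excitatory)."""
    return 2.0 * r - 1.0
```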
- the AP rate of the output neurons after the simulation step may exhibit a clearly wrong value.
- This failure is evident if the AP rate of at least one output neuron fails to meet validity criteria based on training data which is used for the evaluation of the simulation result.
- the AP rate of the failing output neuron is set to a class value from the training data, and the transmission functions of the incoming synapses are now altered in a way that they lead to the class value from the training database. With these altered coupling coefficients, the simulation step is repeated, which now leads to a correct result.
- This altering or changing of the transmission function of the incoming synapses is done in a way that the transmission function is altered to excitatory if the AP rate (or firing rate) of input neuron and output neuron are both higher than 0.5 and the transmission function is altered to inhibitory if at least one normalized AP rate either of the input neuron or of the output neuron is smaller than 0.5.
- the enhancing of the excitatory and inhibitory coupling between two neurons is done via Hebbian learning which is a well-known method for retraining the transmission functions of synapses.
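A minimal sketch of such a Hebbian-style correction for a single synapse, following the rule stated above (excitatory if both normalized rates exceed 0.5, inhibitory otherwise); the learning rate `lr` and the clamping to [-1, 1] are assumptions, not taken from the text:

```python
def retrain_coupling(coupling, in_rate, out_rate, lr=0.1):
    """Correct one synapse after a failed validity check.

    in_rate / out_rate: normalized AP rates in [0, 1]; out_rate is the
    class value the failing output neuron was reset to.
    """
    if in_rate > 0.5 and out_rate > 0.5:
        target = 1.0    # both active: push towards excitatory
    else:
        target = -1.0   # at least one inactive: push towards inhibitory
    new = coupling + lr * (target - coupling)
    return max(-1.0, min(1.0, new))   # keep within [-1, 1]
```

Repeating the simulation with such corrected couplings implements the retraining loop described in the two preceding bullets.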
- the invention also refers to an analysis system for the machine-aided optimization of the classification of parameter data, which comprise several parameters and corresponding parameter values as well as classes to which the parameters are going to be classified and to which they are linked via coupling coefficients.
- the analysis system comprises a data input for transferring the parameter data into the analysis system.
- the analysis system comprises a processing logic with an artificial neuronal network (ANN) having an input layer with several input neurons as well as an output layer with several output neurons.
- the number of the input neurons is selectable according to the number of parameters and the number of output neurons is selectable according to the number of classes.
- the activity of each neuron is defined by its activity potential which again is specified by the frequency of a pulse signal generated by it.
- the input and output neurons are connected via synapses allowing a signal transfer in the direction of the output neurons.
- Each of the synapses comprises a transmission function which is variable between an inhibitory and excitatory coupling effect.
- the synapses transfer the pulse signal of the input neuron at its frequency and additionally apply the transmission function, which decides whether this pulse train has an inhibitory or excitatory effect and determines the strength of the effect.
- the processing logic is configured to set the AP rate of the input neurons dependent on the parameter values from the data input.
- the output neuron has an activation function which sums up the signals obtained from its incoming synapses and generates a pulse signal accordingly. This function simulates the activation of a cell in a brain, e.g. according to the Goldman equation.
- the analysis system comprises an initialization database comprising correlation coefficients between the parameters and the classes.
- the processing logic is configured to set the transmission function of the synapses in an initialization step according to the correlation coefficients from the initialization database.
- the transmission function does not only comprise information whether the coupling is inhibitory or excitatory, but also the strength of the inhibitory/excitatory coupling.
- This strength according to the coupling coefficient determines how much each pulse of the transferred pulse train from the input neuron affects the activation function of the connected output neuron and thus its activity potential. Via this transmission behaviour, the kind of coupling as well as the strength of the coupling can be simulated very well.
- the processing logic is configured to run a simulation step, whereby the AP rate of each output neuron adjusts according to the transmission functions of the incoming synapses.
- the processing logic is further configured to terminate the simulation step b) after a set time or after a per se known abortion criterion is fulfilled.
- the processing logic is configured to calculate and output the classification values from the AP rate of the output neurons. If for example the frequency of the simulation process in the ANN is 1 kHz, one second of running the ANN would already include 1000 simulation steps, in which the AP rates of the output neurons adjust to the AP rates of the input neurons (parameter values) as well as to the coupling coefficients (transmission functions of the synapses).
- the abortion criterion for stopping the simulation process is the exceeding of a defined distance between the AP rates of the two output neurons exhibiting the highest AP rate and/or the exceeding of a defined AP rate threshold value by at least one output neuron and/or the convergence of the AP rates of the output neurons.
- These abortion criteria are per se known and have proved to be reliable criteria for the termination of a simulation process.
- An advantage of this inventive artificial neuronal network is the fact that, by feeding it with the correlation coefficients from an initialization database, it is possible to direct the simulation process into the right direction from the beginning, which strongly reduces the overall computation time normally necessary in back-propagation networks, which have to amend the results over a larger number of generations. Tests with this inventive analysis system revealed excellent values after a very short time.
- the processing logic comprises an evaluation unit and a training database comprising class values of the classes.
- the evaluation unit is configured to evaluate the validity of the AP rates of the output neurons by comparison with class values from the training database.
- the processing logic is configured to set the AP rate of at least the output neuron(s) failing the validity evaluation to a corresponding class value from the training database.
- the transmission function of the incoming synapses to the failing output neuron is altered in a way as to enhance a positive as well as negative coupling of the input neurons and the failing output neuron.
- the processing logic is configured to start the simulation step b) anew with the altered transmission functions. Therefore, the analysis system comprises a correction control which allows a retraining and a correction of obviously wrong results or partial results.
- the processing logic is configured to alter the transmission functions according to Hebbian learning.
- the present invention allows not only a fast and efficient classification of parameters but also a monitoring of parameters with respect to failure classes. This monitoring can even be performed in real-time as the simulation process is very fast.
- Such a classification table could be that of a job agency. Here too, different parameters are linked to the job-seeking persons, which is necessary to classify them according to certain job classes on the market. While the classes are specified in the top row, the parameters are indicated in the column titled "question".
- the inventive system is able to tailor the retrieving or requesting of parameters from the users by asking for parameters which exclude as many other competing classification results as possible.
- the steady state that is achieved after a number of iterations allows the identification of the input neurons that are most significant for the current diagnosis. This result depends strongly on the "context", i.e. the history of the network's activation and the input that is already available.
- Another field of application is the health management e.g. in a company, where a certain business type of the company has a large influence on the type of health risk management.
- Parameters for health management are type of business, number of employees, ratio male/female, e.g. for each department separately, e.g. for the manufacture department, sales department, administration department, management department: department/corresponding operating field, employee data and health risk evaluation.
- the invention can particularly be used for failure monitoring.
- failure classes are formed from known failure situations. Now the parameters can be monitored in real-time if their values lead to relevant AP rates in the output neurons of failure classes. With the invention also the tendency in failure classes can be used to avoid a failure situation.
- a further application field of the invention can be building monitoring.
- the management of large buildings involves a variety of monitored operation parameters such as temperature, humidity, movement of the building (in high skyscrapers), elevator status, traffic flow, sunshine intensity and direction, energy consumption and noise emission. These parameters can be used to regulate the air conditioning, elevator group control, traffic flow guidance, operation of generators etc.
- the problems are similar to the above-mentioned race car example, where the monitoring may detect an illicit, poor or inferior status condition and then tries to analyze the complete building condition to look for countermeasures, changing other operation parameters to counteract the illicit or inferior status (e.g. temperature).
- the processing unit comprises in or in addition to the database a reference value memory in which the reference values of all possible operation parameters of the operating system are stored.
- the reference value can of course be a reference value range.
- the inventive analysis system and method are not only able to check a system of parameters against fixed class values, but also to monitor changes in the class values over time. This can be done to predict failure situations in advance.
- the simulation time of the ANN can be set to the end of the monitoring period, and the AP rate of at least the most active output neurons is output continuously.
- the following terms are used synonymously: communication channel - link - synapse - connection; AP rate - frequency of the firing pulse of a neuron - firing rate - activity of a neuron; pulse signal - pulse train; transmission function of the synapse - function mirroring the coupling coefficient between a parameter and a class, i.e. between an input neuron and an output neuron.
- the inventive system comprises soft- and hardware components having the above-mentioned combined functionality, and the inventive method describes the functionality which can be performed by the inventive system.
- features from the inventive system may be employed for the inventive method and vice versa.
- this kind of system, except for the corresponding wiring and sensors, can be implemented in a software product run on a computer, so that the different components of the system as described above are realized by software modules of the software product.
- Fig. 1 shows a schematic diagram of the analysis system with an artificial neuronal network
- Fig. 2 shows a detailed presentation of an input neuron which is connected via a synapse with an output neuron.
- Fig. 1 shows an analysis system 10 for the classification of parameters that are fed to the analysis system 10 via data input 12.
- the analysis system 10 comprises a processing logic 14 with an artificial neuronal network 16 having an input layer 18 with input neurons 20a-20d and an output layer 22 with output neurons 24a-24c, which are connected with preferably all of the input neurons 20a-20d via synapses 26.
- the processing logic 14 further comprises an evaluation unit 32, an initialization database 31, a training database 34 as well as an output device 34, for example a monitor.
- each synapse 26 has a longitudinal axon part 28 which is connected to the input neuron 20a-20d.
- the axon part 28 is connected with an axon terminal 30 which is connected to the output neuron 24a-c.
- the transmission function of the synapse 26 is integrated preferably in the axon terminal 30.
- the transmission function in the axon terminal 30 alters each pulse of the pulse signal according to the AP rate of the input neuron 20a-d to an output signal which contributes positively or negatively to the AP rate (firing rate) of the output neuron 24a-c.
- the analysis system works as follows:
- the analysis system 10 optimizes the classification of parameters by the use of the artificial neuronal network ANN 16.
- the ANN 16 performs a simulation of the "think process" of a biological brain.
- the ANN 16 has to be adapted to the present classification task by setting the number of input neurons 20a-d according to the number and type of parameters and by setting the number of output neurons 24a-c according to the number of classes.
- the analysis system 10 comprises an initialization database 31 which is fed with the correlation coefficients between the parameters and the classes. Accordingly, during the initialization stage, the transmission function of the synapses, i.e. the transmission function of the axon terminals 30, is set according to the correlation coefficients contained in the initialization database. The transmission functions in the axon terminals 30 of the synapses 26 are thus set to inhibitory or excitatory, and the amount of inhibition or excitation is also set according to the coupling coefficients, which are shown for example in table 1 above. In this connection it is important to mention that preferably normalized data are processed. For example, a parameter value, e.g. a motor speed of 3000 rpm, is set in relation to its maximum value, e.g. 6000 rpm. In this case the normalized value for the speed is 0.5.
- the neurons have a maximal AP rate which is e.g. 500 Hz. If an input neuron is e.g. to be set to a parameter value of 0.5, its firing rate is set to 250 Hz.
- the analysis system 10 obtains parameter data via the data input 12 and possibly normalizes it.
- the data can be input in-time, e.g. the feeding of operation parameters of a car in a car race, e.g. the motor speed.
- the AP rate of the input neurons 20a-d leads with each pulse to a transmission event of the synapse 26 to the output neuron 24a-c, which, according to the transmission function of the synapse 26 in the axon terminal 30, is inhibitory or excitatory in an amount according to the value of the coupling coefficient.
- the output neuron is thus fed, at a frequency according to the AP rate of the input neuron 20a-d, with a more or less inhibitory or excitatory output signal.
- the output signals of all synapses are summed up via an activation function in the output neuron, leading to the generation of an AP rate, i.e. the firing rate of the output neuron.
- the AP rate specifies the class value. If e.g. the firing rate is 350 Hz, the normalized value with a maximum frequency of 500 Hz is 0.7, which is a quite high AP rate. The maximum normalized AP rate is 1, the lowest is 0.
- the ANN has to run for a certain time. If the ANN is run at 1 kHz, then after 10 ms, 10 processing cycles have been performed, which can already be sufficient for the output neurons 24a-c to "swing" into their final AP rates. Accordingly, when the artificial neuronal network runs over a certain time, the firing rate of each output neuron "swings" into its final AP rate corresponding to the sum of the inhibitory and excitatory output signals coming from all of its incoming synapses 26.
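The "swinging in" of an output neuron over successive 1 kHz processing cycles can be sketched as follows; the relaxation factor `tau` and the logistic stand-in for the activation function (in place of the Goldman equation) are assumptions for illustration only.

```python
import math

def run_simulation(input_rates, couplings, cycles=10, max_rate=500.0, tau=0.5):
    """Let one output neuron's AP rate relax towards its steady state.

    input_rates: firing rates of the input neurons in Hz.
    couplings:   one coupling coefficient in [-1, 1] per input neuron.
    Each loop iteration corresponds to one 1 kHz processing cycle.
    """
    rate = 0.0
    for _ in range(cycles):
        # summed synaptic drive, sign and strength set by the couplings
        drive = sum(r * c for r, c in zip(input_rates, couplings))
        # logistic squashing as a placeholder activation function
        target = max_rate / (1.0 + math.exp(-drive / max_rate))
        rate += tau * (target - rate)   # "swing" towards the target rate
    return rate
```

With neutral couplings the rate settles near half the maximum, and excitatory couplings push it higher, mirroring the qualitative behaviour described above.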
- the simulation process can also be stopped after certain abortion criteria are met, for example obtaining a minimum absolute firing rate, e.g. 350 Hz, of the most active output neuron and/or having a certain gap between the AP rate of the most active output neuron and the AP rate of the second most active output neuron, e.g. 150 Hz.
- the processing logic outputs the corresponding class value, i.e. the AP rate of at least the most active output neurons 24a-c, to an output device 34, for example a monitor.
- the classification result is the firing rate of the output neurons, which can be normalized if desired.
- the analysis system 10 is able to evaluate the classification result via its evaluation unit 32.
- the evaluation unit 32 is connected with the training database 34, and the training database 34 has training sets of class values.
- the evaluation unit 32 compares the activity potential of the output neurons 24a-c with the class values from the training database, and if the difference between a class value of the training set and a class value corresponding to the activity potential of at least one output neuron 24a-c is too high, this is evaluated as a failure.
- the corresponding failing output neuron 24a-c is set to an activity potential according to the class value of the training set and the transmission function of the synapses is altered so that the now set activity potential of the output neuron 24a-c from the training database 34 corresponds to the sum of all incoming output signals of the connected synapses 26.
- the simulation is again run with the altered transmission functions, and again the simulation is performed until the set runtime has lapsed or until at least one of the above-mentioned abortion criteria is fulfilled.
- the re-trained result can again be evaluated in the evaluation unit. If the result is evaluated as being in line with the training dataset, the result is output to the output device 34.
- the advantage of the inventive analysis system and method is the fact that a result is obtained much sooner than in well-known back propagation networks where either multiple layers are used and/or the result has to be re-fed to an input layer.
- the analysis system can be used not only for classification purposes, but also for monitoring of parameter data, particularly if the parameter data are monitored with respect to failure classes.
- the computational effort of the inventive neuronal network is very low, the data can be processed very fast and online. Accordingly, the invention offers a wide variety of applications in technical failure monitoring as well as in all kinds of classification tasks.
- AP describes a single spike of the membrane voltage of the neuron.
- the rate or frequency of the APs describes the neuron's activity.
- ANN 16 artificial neuronal network
Abstract
The invention relates to a method for the machine-aided optimization of the classification of a number of parameters exhibiting parameter values into classes, to which the parameters are linked by coupling coefficients, in which method an artificial neuronal network (ANN) (16) is used for the classification of the parameters, the ANN having an input layer (18) with a plurality of input neurons (20a-d), the number thereof being selected according to the number of parameters, as well as an output layer (22) with output neurons (24a-c), the number thereof being selected according to the number of classes, which input and output neurons (20, 24) are connected via synapses (26) allowing a signal transfer from the input neurons (20a-d) to the output neurons (24a-c), whereby each of the synapses (23a, 23b) includes an adjustable transmission function, whereby the sequence of action potentials (firing frequency) of each input and output neuron (20a-d, 24a-c) is characterized by the frequency of a pulse signal generated by it, whereby the firing frequency of the input neurons (20a-d) is set according to the parameter values from the data input (12) and whereby the firing frequency of the output neurons (24a-c) depends on the non-linear sum of all signals obtained from all incoming synapses (23a, 23b).
According to the invention, a) in an initialization step an initialization database (31) is used which contains the correlation coefficients between the parameters and the classes; in this initialization step the transmission function of all synapses (26) is set according to the correlation coefficients from the initialization database (31); b) in a simulation step the ANN (16) is run, in which the firing frequency of the output neurons (24a-c) adjusts according to the number and transmission functions of the incoming synapses (26); c) the simulation step is terminated after a set time, after an abortion criterion is fulfilled or after it is stopped manually; and d) from the firing frequency of the output neurons (24a-c) class values are obtained. This method allows a very fast classification of parameter data, particularly for monitoring purposes.
Description
Method and analysis system for the classification of parameters
The present invention relates to a method for the machine-aided optimization of the classification of a number of parameters exhibiting parameter values into classes, to which the parameters are linked by coupling coefficients. In the method, an artificial neuronal network (ANN) is used for the classification of the parameters. The ANN has an input layer with a plurality of input neurons, the number thereof being selected according to the number of parameters as well as an output layer with output neurons, the number thereof being selected according to the number of classes. The input and output neurons are connected via synapses allowing a signal transfer from the input neurons to the output neurons, whereby each of the synapses includes an adjustable transmission function. The transmission function thereby corresponds to the coupling coefficient between a parameter value (input neuron) and a corresponding class (output neuron). The transmission function can vary between an inhibitory and excitatory effect on the activity of the connected output neurons and it comprises a value specifying the amount of the inhibitory or excitatory effect. Generally, the activity of each input and output neuron is indicated by the rate of its action potentials (AP) which again defines a pulse signal with a frequency according to the activity of the neuron. An inhibitory effect of the transmission function leads to a reduction of the frequency of the pulse signal whereas an excitatory transmission function leads to an increase of the frequency. If for example normalized coupling coefficients are used, then a coupling coefficient of -1 would be fully inhibitory, reducing the frequency of the connected output neuron by a maximum amount. 
The value 0.0 would be neutral, meaning that it has no effect on the frequency of the output neuron, whereas the value +1.0 would have a maximum excitatory effect, meaning that the frequency of the pulse signal of the output neuron is increased by a maximum value. Of course, the activity of the output neuron is dependent on the nonlinear sum of the transmission functions of all incoming synapses, e.g. according to the Goldman equation.
Generally, the rate of the APs of the input neurons is set according to the parameter values obtained from a data input, whereas the rate of the APs of the output neurons depends, as mentioned above, on the sum of the signals obtained from all incoming synapses. These signals in turn depend on the transmission functions of the corresponding synapses, i.e. they are highly non-linear and time-dependent. Generally, the signal obtained from a synapse via its transmission function corresponds to a pulse train with a frequency corresponding to the action potential rate of the input neuron, affected by the amount of the inhibitory or excitatory effect of the transmission function.
The output neuron has an activation function, which sums up the frequency and the excitatory or inhibitory effect of each incoming signal and thus generates a pulse signal with a frequency depending on this sum of the incoming signals from all synapses. The activation function could e.g. correspond to the Goldman equation, which is used to calculate the membrane potential of a biological neuron.
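As an illustration, the summation described above can be sketched in Python. This is a minimal sketch, not the patented implementation: a logistic squashing function is assumed as a simple stand-in for the Goldman equation, and the 500 Hz maximum rate is taken from the exemplary values given later in the text.

```python
import math

MAX_RATE_HZ = 500.0  # assumed maximum firing rate of a neuron

def output_rate(input_rates, couplings):
    """Non-linear sum of all incoming synapse signals.

    input_rates: firing rates of the input neurons in Hz
    couplings:   coupling coefficients in [-1, +1], one per synapse
    """
    # Each synapse contributes its input rate scaled by the coupling:
    # positive couplings excite, negative couplings inhibit.
    drive = sum(r * c for r, c in zip(input_rates, couplings))
    # Logistic squashing of the summed drive into [0, MAX_RATE_HZ].
    return MAX_RATE_HZ / (1.0 + math.exp(-drive / MAX_RATE_HZ))
```

With all couplings neutral the summed drive is zero, so this particular stand-in settles at the half-maximal rate; a Goldman-type activation would differ in detail but shares the saturating, non-linear character.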
According to the invention, first an initialization step is performed wherein an initialization database containing the correlation coefficients between the input and output neurons - i.e. between the parameters and the classes - is used, whereby in this initialization step the transmission functions or coupling coefficients of all synapses are computed (one-shot learning) according to the correlation coefficients from the initialization database.
Now, after initialization the artificial neuronal network is run in a simulation step, whereby the rate of the APs of each output neuron adjusts according to the sum of signals of the transmission functions of its incoming synapses. Thus, via its activation function each output neuron generates its own action potentials, i.e. the frequency of its generated pulse signal.
The simulation step is e.g. terminated after a set time. For example, if the artificial neuronal network is run with a frequency of 10 kHz, the set time could be for example 1 s or 10 s. Alternatively, the simulation step is terminated after an abortion criterion is fulfilled. Abortion criteria for simulations are per se known in the art. Such abortion criteria may for example comprise the distance of the AP rate of the most active output neuron from the AP rate of the second most active output neuron. Other abortion criteria are the absolute value of the AP rate of the most active output neuron and/or the convergence of the activity (AP rates) of the output neurons. Of course, these different abortion criteria can be applied solely or in combination. Thus, the following combination of abortion criteria may be applied: reaching a certain absolute AP rate of the most active output neuron and additionally ensuring a certain distance of the AP rate of the most active output neuron to that of the second most active output neuron. Finally, the AP rates of the output neurons are the class values obtained in the simulation.
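The combined abortion criterion from this paragraph, reaching an absolute AP rate with the most active output neuron while keeping a minimum distance to the second most active one, can be sketched as follows; the default threshold values are only the exemplary figures used later in the text:

```python
def should_stop(rates, min_top_rate=350.0, min_gap=150.0):
    """Combined abortion criterion: the most active output neuron must
    reach an absolute AP rate AND keep a minimum distance to the
    second most active output neuron. Real thresholds are task-specific.
    """
    # Pick the two highest AP rates among the output neurons.
    top, second = sorted(rates, reverse=True)[:2]
    return top >= min_top_rate and (top - second) >= min_gap
```

The criteria named in the text can of course also be checked individually; combining them simply ANDs the individual conditions.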
With the invention, it is not only possible to classify data but also to monitor parameters for the existence of failures. For example, the operation parameters of a car, such as motor speed, car speed, spoiler angle, fuel consumption, motor temperature, brake disc temperature and tire temperature, can be fed in real time to the analysis system, and the classes to be monitored may be different hazardous-situation classes such as piston jamming, motor overheating, problems with the turbo engine etc. If the value of such a failure class becomes too high, steps can be undertaken immediately.
Preferably, normalized values are used for the parameter values and/or for the class values, so that parameter values as well as class values lie between 0 and 1. This is the easiest way to handle and evaluate the parameter values, the coupling coefficients as well as the class values. With respect to the frequency of the pulse signals, the normalized values are directly transferred to a given frequency range. For example, if the minimum frequency of the activity potential of a neuron is 0 Hz and the maximum frequency is 500 Hz, then a parameter value or class value of 0.5 would correspond to a frequency of 250 Hz. Of course, these values are only exemplary.
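The linear mapping between normalized values and the frequency range can be written down directly; the 0 to 500 Hz range is the exemplary one from the text:

```python
F_MIN_HZ, F_MAX_HZ = 0.0, 500.0  # exemplary frequency range

def value_to_freq(v):
    """Map a normalized parameter/class value in [0, 1] linearly
    onto the firing-frequency range."""
    return F_MIN_HZ + v * (F_MAX_HZ - F_MIN_HZ)

def freq_to_value(f):
    """Inverse mapping: firing frequency back to a normalized value."""
    return (f - F_MIN_HZ) / (F_MAX_HZ - F_MIN_HZ)
```

So a value of 0.5 maps to 250 Hz, matching the example in the text.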
For the coupling coefficients, values between -1.0 and 1.0 can also be used, whereby -1.0 is maximally inhibitory, 1.0 is maximally excitatory and 0 is neutral, corresponding to the values 0, 0.5 and 1.0 for the correlation coefficients from the initialization database. These values are used for the computation of the coupling coefficients in the network (e.g. one-shot learning).
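One plausible reading of this correspondence is a simple linear mapping from the correlation range [0, 1] onto the coupling range [-1, +1]; the text does not spell out the exact formula, so the linearity is an assumption:

```python
def corr_to_coupling(r):
    """One-shot initialization: map a correlation coefficient in [0, 1]
    (0 = maximally inhibitory, 0.5 = neutral, 1 = maximally excitatory)
    linearly onto a coupling coefficient in [-1, +1]."""
    return 2.0 * r - 1.0
```

This reproduces the three anchor points named in the text: 0 maps to -1.0, 0.5 to 0.0 and 1.0 to +1.0.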
Of course, it may happen that the AP rates of the output neurons after the simulation step show a clearly wrong value. This failure is evident if the AP rate of at least one output neuron fails to meet validity criteria based on training data which is used for the evaluation of the simulation result. In this case, the AP rate of the failing output neuron is set to a class value from the training data, and the transmission functions of the incoming synapses are altered in a way that they lead to the class value from the training database. With these altered coupling coefficients, the simulation step is repeated, which now leads to a correct result. This altering of the transmission functions of the incoming synapses is done in such a way that the transmission function is altered towards excitatory if the normalized AP rates (firing rates) of input neuron and output neuron are both higher than 0.5, and altered towards inhibitory if at least one normalized AP rate, either of the input neuron or of the output neuron, is smaller than 0.5. This means that positive (excitatory) coupling as well as negative (inhibitory) coupling is enhanced. This can be done for example by making the change of the coupling dependent on the product of the AP rates of input and output neuron. Preferably, the enhancing of the excitatory and inhibitory coupling between two neurons is done via Hebbian learning, which is a well-known method for retraining the transmission functions of synapses.
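A minimal sketch of such a retraining update, assuming a product-based Hebbian rule with a hypothetical learning rate `lr`; the sign rule follows the 0.5 threshold described above, while the magnitude term is only one of several possibilities:

```python
def retrain_coupling(c, pre, post, lr=0.1):
    """Hebbian-style update of one coupling coefficient.

    c:         current coupling coefficient in [-1, +1]
    pre, post: normalized firing rates in [0, 1] of the input and
               output neuron (the post rate being the class value
               set from the training data)
    """
    # More excitatory when both rates exceed 0.5, more inhibitory
    # when at least one of them falls below 0.5, as in the text.
    sign = 1.0 if (pre > 0.5 and post > 0.5) else -1.0
    # Product-based magnitude, one possible Hebbian variant.
    c_new = c + lr * sign * pre * post
    return max(-1.0, min(1.0, c_new))  # keep within [-1, +1]
```

Applied to all incoming synapses of a failing output neuron, this strengthens both the excitatory and the inhibitory couplings before the simulation step is repeated.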
The invention also refers to an analysis system for the machine-aided optimization of the classification of parameter data, which comprises several parameters and corresponding parameter values as well as classes into which the parameters are to be classified and to which they are linked via coupling coefficients.
The analysis system comprises a data input for transferring the parameter data into the analysis system. The analysis system comprises a processing logic with an artificial neuronal network (ANN) having an input layer with several input neurons as well as an output layer with several output neurons. The number of input neurons is selectable according to the number of parameters and the number of output neurons is selectable according to the number of classes. The activity of each neuron is defined by its activity potential, which again is specified by the frequency of a pulse signal generated by it. The input and output neurons are connected via synapses allowing a signal transfer in the direction of the output neurons. Each of the synapses comprises a transmission function which is variable between an inhibitory and an excitatory coupling effect. Generally, the synapses transfer the frequency pulse signal of the input neuron together with the transmission function, which decides whether this pulse train has an inhibitory or excitatory effect and determines the strength of that effect. The processing logic is configured to set the AP rate of the input neurons dependent on the parameter values from the data input. The output neuron has an activation function which sums up the signals obtained from its incoming synapses and generates a pulse signal accordingly. This function simulates the activation of a cell in a brain, e.g. according to the Goldman equation.
According to the invention, the analysis system comprises an initialization database comprising correlation coefficients between the parameters and the classes. The processing logic is configured
to set the transmission function of the synapses in an initialization step according to the correlation coefficients from the initialization database.
In this connection it should be noted that the transmission function does not only comprise the information whether the coupling is inhibitory or excitatory, but also the strength of the inhibitory/excitatory coupling. This strength, according to the coupling coefficient, determines how much each pulse of the transferred pulse train from the input neuron affects the activation function of the connected output neuron and thus its activity potential. Via this transmission behaviour, the kind of coupling as well as the strength of the coupling can be simulated very well.
The processing logic is configured to run a simulation step, whereby the AP rate of each output neuron adjusts according to the transmission functions of the incoming synapses. The processing logic is further configured to terminate the simulation step b) after a set time or after a per se known abortion criterion is fulfilled. Finally, the processing logic is configured to calculate and output the classification values from the AP rates of the output neurons. If for example the frequency of the simulation process in the ANN is 1 kHz, one second of running the ANN would already include 1000 simulation steps, during which the AP rates of the output neurons adjust to the AP rates of the input neurons (parameter values) as well as to the coupling coefficients (transmission functions of the synapses).
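The simulation described above can be sketched as repeated processing cycles with an optional early stop; the toy step function and its convergence targets below are illustrative assumptions, not values from the text:

```python
def run_simulation(step_fn, rates, n_steps=1000, stop_fn=None):
    """Run the ANN for n_steps processing cycles (e.g. 1000 cycles at
    1 kHz correspond to one second of simulated time), optionally
    stopping early when an abortion criterion is met."""
    for _ in range(n_steps):
        rates = step_fn(rates)  # one update of all output AP rates
        if stop_fn is not None and stop_fn(rates):
            break
    return rates

# Toy step function: each output rate relaxes toward a fixed target,
# mimicking the "swinging in" of the output neurons.
targets = [400.0, 120.0, 60.0]
step = lambda r: [ri + 0.1 * (t - ri) for ri, t in zip(r, targets)]
final = run_simulation(step, [0.0, 0.0, 0.0])
```

In the real system the step function would evaluate all synapse transmission functions and the output neurons' activation functions; the loop structure and the termination logic are the point of this sketch.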
Preferably, the abortion criterion for stopping the simulation process is the exceeding of a defined distance between the AP rates of the two output neurons exhibiting the highest AP rate and/or the exceeding of a defined AP rate threshold value by at least one output neuron and/or the convergence of the AP rates of the output neurons. These abortion criteria are per se known and have proved to be reliable criteria for the termination of a simulation process.
The advantage of this inventive artificial neuronal network is that, by feeding it with the correlation coefficients from an initialization database, it is possible to direct the simulation process into the right direction right from the beginning. This strongly reduces the overall computation time normally necessary in back-propagation networks, which have to amend the results over a large number of generations. Tests with this inventive analysis system revealed excellent values after a very short time.
Of course it is possible to perform a retraining of the artificial neuronal network if at least one AP rate of an output neuron deviates clearly from the class values of a training database which is used for the evaluation of the simulation result. Accordingly, the processing logic preferably comprises an evaluation unit, and the training database comprises class values of the classes. The evaluation unit is configured to evaluate the validity of the AP rates of the output neurons by comparison with class values from the training database.
Hereby, the processing logic is configured to set the AP rate of at least the output neuron(s) failing the validity evaluation to a corresponding class value from the training database. Now the transmission functions of the incoming synapses to the failing output neuron are altered in such a way as to enhance the positive as well as the negative coupling between the input neurons and the failing output neuron. This means that strong positive couplings become stronger and inhibitory couplings also become stronger, e.g. via Hebbian learning. After this retraining of the coupling coefficients, the processing logic is configured to start the simulation step b) anew with the altered transmission functions. Therefore, the analysis system comprises a correction control which allows a retraining and a correction of obviously wrong results or partial results.
Preferably, the processing logic is configured to alter the transmission functions according to Hebbian learning.
The present invention allows a fast and efficient classification of parameters but also a monitoring of parameters on failure classes. This monitoring can even be performed in real-time as the simulation process is very fast.
An example of a classification table is shown hereinafter. This classification table could be that of a job agency. Also here, different parameters are linked to the job-seeking persons, which is necessary to classify them according to certain job classes on the market. While the classes are specified in the top line, the parameters are indicated in the column titled "question".
The values in the fields correspond to coupling coefficients, normalized to 1. Values above 0.5 describe positive couplings, and values below 0.5 characterize negative couplings, i.e. inhibitory effects. 0.5 is neutral, i.e. the parameter has no significant effect on the class. Looking at row No. 6, "hobbies", in the left column, it is apparent that the hobbies provide no information about the class "worker". They simply have no influence, which is apparent from the value of 0.5 for all sports activities.
The intensity of the coupling is stronger the more the value is remote from 0.5. Excluding parameters would have values of e.g. 0, whereas a characterizing parameter would have a value near 1.
Table 1: Example of an initialization database with correlation coefficients between classes, here different jobs, and parameters such as education, experience and hobbies.
Particularly in configuration tasks there are a lot of mutually excluding parameters if attributes (such as "working experience") are subdivided into unique, exclusive parameters. These data structures are defined in the initialization database and transformed into a suitable network structure through the initialization process described above. It further has to be considered that there are parameter groups which coexist independently, as e.g. type of motor, outer appearance and inner appearance, although there might also be correlations between them (a certain appearance is only possible with a certain type of motor etc.).
The inventive system is able to tailor the retrieving or requesting of parameters from the users by asking for parameters which exclude as many other competing classification results as possible. In systems that do not supply a complete set of input data, caused either by missing sensors or by missing data input from other sources, a classification may nevertheless be attempted based on the available data. If the evaluation described above shows that this result is invalid, the ANN can be used to query for additional data. This could imply e.g. asking for sensor data that must be acquired manually (hand-held temperature sensor, geometric measurements, ...) or putting questions to a human user ("do you have a headache", ...).
The selection of which of the missing data is optimal for resolving the diagnosis (= classification) problem can be achieved by activating an appropriate feedback (additional circuits that are not used for the classification itself) and exploiting the random activation properties of the input neurons (possibly using an additional layer of neurons for this purpose). In this case, the steady state that is achieved after a number of iterations allows the identification of the input neurons that are most significant for the current diagnosis. This result depends strongly on the "context", i.e. the history of the network's activation and the input that is already available.
This process reduces the number of additional measurements significantly and makes the complete classification system easier to handle for complex tasks with many, partly unavailable sensors.
In the configuration of computer networks, there is a multitude of configuration parameters to consider which include the processing performance, the access and output of data from the network, the system architecture, the main software application, the estimated number of users, the price and maintenance of hardware components, the lifetime of hardware components, the cooling requirement etc.
Another field of application is health management, e.g. in a company, where the business type of the company has a large influence on the type of health risk management. Parameters for health management are: type of business, number of employees, ratio male/female (e.g. for each department separately, e.g. for the manufacturing, sales, administration and management departments), department/corresponding operating field, employee data and health risk evaluation.
The invention can particularly be used for failure monitoring. To this end, failure classes are formed from known failure situations. The parameters can then be monitored in real time to see whether their values lead to relevant AP rates in the output neurons of failure classes. With the invention, the tendency in failure classes can also be used to avoid a failure situation.
A further application field of the invention is building monitoring. The management of large buildings involves a variety of monitored operation parameters such as temperature, humidity, movement of the building (in high skyscrapers), elevator status, traffic flow, sunshine intensity and direction, energy consumption and noise emission. These parameters can be used to regulate the control of the air conditioning, the elevator group control, the traffic flow guidance, the operation of generators etc. The problems are similar to the above-mentioned race car problem, where the monitoring may face an illicit, poor or inferior status condition and then tries to analyze the complete building condition to look for countermeasures by changing other operation parameters to counteract the illicit or inferior status (e.g. temperature).
Preferably, the processing unit comprises, in or in addition to the database, a reference value memory in which the reference values of all possible operation parameters of the operating system are stored.
The reference value can of course be a reference value range.
It should be understood that the inventive analysis system and method are not only able to check a system of parameters against fixed class values, but can also monitor changes in the class values over time. This can be done to predict failure situations in advance. In this case the simulation time of the ANN can be set until the end of the monitoring period, and the AP rate of at least the most active output neuron is output continuously.
The following terms are used as synonyms: communication channel - link - synapse - connection; AP rate - frequency of the firing pulse of a neuron - firing rate - activity of a neuron; pulse signal - pulse train; transmission function of the synapse - function mirroring the coupling coefficient between a parameter and a class, i.e. between an input neuron and an output neuron.
It should be understood that the inventive system comprises software and hardware components having the above-mentioned combined functionality and that the inventive method describes the functionality which can be performed by the inventive system. Thus, features from the inventive system may be employed for the inventive method and vice versa. Usually this kind of system, except for the corresponding wiring and sensors, is implemented in a software product running on a computer, so that the different components of the system as described above are realized by software modules of the software product.
The invention is described herein under by way of example with reference to an exemplary embodiment in conjunction with the schematic drawing, in which:
Fig. 1 shows a schematic diagram of the analysis system with an artificial neuronal network, and
Fig. 2 a detailed presentation of an input neuron which is connected via a synapse with an output neuron.
Fig. 1 shows an analysis system 10 for the classification of parameters that are fed to the analysis system 10 via the data input 12. The analysis system 10 comprises a processing logic 14 with an artificial neuronal network 16 having an input layer 18 with input neurons 20a-20d and an output layer 22 with output neurons 24a-24c, which are connected with preferably all of the input neurons 20a-20d via synapses 26. The processing logic 14 further comprises an evaluation unit 32, an initialization database 31, a training database 34 as well as an output device 36, for example a monitor.
The synapse 26 is shown in more detail in Fig. 2. Accordingly, each synapse 26 has a longitudinal axon part 28 which is connected to the input neuron 20a-20d. On the side of the output neuron 24a-c the axon part 28 is connected with an axon terminal 30, which is connected to the output neuron 24a-c. The transmission function of the synapse 26 is preferably integrated in the axon terminal 30. The transmission function in the axon terminal 30 alters each pulse of the pulse signal according to the AP rate of the input neuron 20a-d into an output signal which contributes positively or negatively to the AP rate (firing rate) of the output neuron 24a-c.
The analysis system works as follows:
The analysis system 10 optimizes the classification of parameters by the use of the artificial neuronal network ANN 16. The ANN 16 performs a simulation of the "think process" of a biological brain.
In the beginning of the simulation or beforehand the ANN 16 has to be adapted to the present classification task by setting the number of input neurons 20a-d according to the number and type of parameters and by setting the number of output neurons 24a-c according to the number of classes.
Additionally, the analysis system 10 comprises an initialization database 31 which is fed with the correlation coefficients between the parameters and the classes. Accordingly, during the initialization stage, the transmission function of the synapses 26, i.e. the transmission function of the axon terminals 30, is set according to the correlation coefficients contained in the initialization database. Accordingly, the transmission functions in the axon terminals 30 of the synapses 26 are set to inhibitory or excitatory, and the amount of inhibition or excitation is also set according to the coupling coefficients, which are shown for example in Table 1 above.
In this connection it is important to mention that preferably normalized data are processed. For example, a parameter value, e.g. a motor speed of 3000 rpm, is set in relation to its maximum value, e.g. 6000 rpm. In this case the normalized value for the speed is 0.5.
The neurons have a maximal AP rate of e.g. 500 Hz. If an input neuron is e.g. to be set to a parameter value of 0.5, its firing rate is set to 250 Hz.
The analysis system 10 obtains parameter data via the data input 12 and possibly normalizes it. In case of a monitoring function of the analysis system the data can be input in real time, e.g. the feeding of operation parameters of a car in a car race, e.g. the motor speed.
Each pulse of the AP rate of the input neurons 20a-d leads to a transmission event of the synapse 26 to the output neuron 24a-c, which, according to the transmission function of the synapse 26 in the axon terminal 30, is inhibitory or excitatory in an amount according to the value of the coupling coefficient. The output neuron is thus fed, with a frequency according to the AP rate of the input neuron 20a-d, with a more or less inhibitory or excitatory output signal. In the output neuron, the output signals of all synapses are summed up via an activation function, leading to the generation of an AP rate, i.e. to a sequence of pulse signals with a certain frequency (firing rate) corresponding to the sum of the incoming output signals of the synapses 26. The AP rate specifies the class value. If e.g. the firing rate is 350 Hz, the normalized value with a maximum frequency of 500 Hz is 0.7, which is a quite high AP rate. The maximum normalized AP rate is 1, the lowest is 0.
To enable the output neurons to set their AP rates to the sum of incoming signals, the ANN has to run a certain time. If the ANN is run with 1 kHz, then after 10 ms, 10 processing cycles have been performed which can already be sufficient for the output neurons 24a-c to "swing" into their final AP rate. Accordingly, when the artificial neuronal network runs over a certain time, the firing rate of the output neurons "swings" into its final AP rate corresponding to the sum of the inhibitory and excitatory output signals coming from all of its incoming synapses 26.
The simulation process can also be stopped after certain abortion criteria are met, for example reaching a minimum absolute firing rate, e.g. 350 Hz, of the most active output neuron and/or having a certain gap, e.g. 150 Hz, between the AP rate of the most active output neuron and the AP rate of the second most active output neuron.
Another termination criterion is the convergence of the AP rates of the output neurons. The processing logic then outputs the corresponding class value, i.e. the AP rate of at least the most active output neurons 24a-c, to an output device 36, for example a monitor.
Generally, the classification result is the firing rate of the output neurons, which can be normalized if desired.
Preferably, the analysis system 10 is able to evaluate the classification result via its evaluation unit 32. For a validation check of the classification data obtained in the output neurons 24a-c, the evaluation unit 32 is connected with the training database 34, and the training database 34 has training sets of class values. The evaluation unit 32 compares the activity potential of the output neurons 24a-c with the class values from the training database, and if the difference between a class value of the training set and a class value corresponding to the activity potential of at least one output neuron 24a-c is too high, this is evaluated as a failure. In this case the corresponding failing output neuron 24a-c is set to an activity potential according to the class value of the training set, and the transmission functions of the synapses are altered so that the now-set activity potential of the output neuron 24a-c from the training database 34 corresponds to the sum of all incoming output signals of the connected synapses 26.
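The validity check just described can be sketched as a simple comparison of the output AP rates with the class values (expressed as rates) from the training set; the tolerance `tol` is an assumed threshold, not a value from the text:

```python
def validate_and_flag(output_rates, training_rates, tol=100.0):
    """Compare the output neurons' AP rates with the corresponding
    class values from the training set and return the indices of
    output neurons whose deviation exceeds the tolerance; these are
    the failing neurons to be reset and retrained."""
    return [i for i, (o, t) in enumerate(zip(output_rates, training_rates))
            if abs(o - t) > tol]
```

The returned indices identify which output neurons get their AP rate overwritten with the training class value before the Hebbian retraining of their incoming synapses.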
After this adaption of the transmission functions of the incoming synapses has been performed, the simulation is run again with the altered transmission functions, and again the simulation is performed until the set runtime has lapsed or until at least one of the above-mentioned abortion criteria is met. Of course, the retrained result can again be evaluated in the evaluation unit. If the result is evaluated as being in line with the training dataset, the result is output to the output device 36.
The advantage of the inventive analysis system and method is the fact that a result is obtained much sooner than in well-known back propagation networks where either multiple layers are used and/or the result has to be re-fed to an input layer.
The analysis system can be used not only for classification purposes, but also for monitoring of parameter data, particularly if the parameter data are monitored with respect to failure classes. This means that parameters of an industrial plant, a building or of a car or an airplane can be monitored with the invention on known failure classes and thus failures can be detected very soon. As the computational effort of the inventive neuronal network is very low, the data can be processed very fast and online. Accordingly, the invention offers a wide variety of applications in technical failure monitoring as well as in all kinds of classification tasks.
According to standard terminology in ANNs the term "AP" describes a single spike of the membrane voltage of the neuron. The rate or frequency of the APs describes the neuron's activity.
The following terms are used as synonyms: rate - frequency.
It is clear for the skilled person that the invention is not restricted to the embodiment of the figure.
List of reference numbers:
10 Analysis system
12 Data input
14 processing logic
16 artificial neuronal network (ANN)
18 input layer of input neurons
20a-d input neurons
22 output layer of output neurons
24a-c output neurons
26 synapses
28 axon
30 axon terminal with transmission function (coupling coefficient)
31 initialization database with correlation coefficients
32 evaluation unit
34 training database with class values
36 output device - monitor
Claims
1. Method for the machine-aided optimization of the classification of a number of parameters exhibiting parameter values into classes, to which the parameters are linked by coupling coefficients, in which method an artificial neuronal network (ANN) (16) is used for the classification of the parameters, which ANN has an input layer (18) with a plurality of input neurons (20a-d), the number thereof being selected according to the number of parameters, as well as an output layer (22) with output neurons (24a-c), the number thereof being selected according to the number of classes, which input and output neurons (20, 24) are connected via synapses (26) allowing a signal transfer from the input neurons (20a-d) to the output neurons (24a-c), whereby each of the synapses (23a, 23b) includes an adjustable transmission function, whereby the activity (firing rate) of each input and output neuron (20a-d, 24a-c) is characterized by the frequency of a pulse signal generated by it, whereby the AP rate of the input neurons (20a-d) is set according to the parameter values from the data input (12) and whereby the AP rate of the output neurons (24a-c) depends on the sum of all signals obtained from all incoming synapses (23a, 23b), characterized in that a) in an initialization step an initialization database (31) is used which contains the correlation coefficients between the parameters and the classes, in which initialization step the transmission function (coupling coefficients) of all synapses (26) is set according to the correlation coefficients from the initialization database (31), b) in a simulation step the ANN (16) is run, in which the firing frequency of the output neurons (24a-c) adjusts according to the number and coupling coefficients (transmission functions) of the incoming synapses (26), c) the simulation step is terminated after a set time or after an abortion criterion is fulfilled or after it is stopped manually, and d) from the firing frequency of the output
neurons (24a-c) class values are obtained.
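Steps a) to d) of claim 1 can be illustrated with a minimal rate-based sketch. This is not the claimed implementation: the function name, the relaxation update, and the sigmoid bounding of the firing rate are assumptions of the sketch, chosen only to show how initialization from correlation coefficients, simulation, abortion, and class readout fit together.

```python
import numpy as np

def classify(param_values, corr_coeffs, steps=100, threshold=None):
    """Sketch of claim 1: a rate-coded single-layer network.

    param_values : normalized parameter values in [0, 1], one per input neuron
    corr_coeffs  : matrix (inputs x classes) of correlation coefficients,
                   used to initialize the synaptic transmission functions (step a)
    """
    # Step a) initialization: transmission functions of all synapses are
    # set from the initialization data base of correlation coefficients.
    weights = np.array(corr_coeffs, dtype=float)

    # Input firing rates are set according to the parameter values.
    in_rates = np.array(param_values, dtype=float)

    # Step b) simulation: each output rate adjusts according to the sum of
    # signals from its incoming synapses (here a simple rate relaxation).
    out_rates = np.zeros(weights.shape[1])
    for _ in range(steps):
        drive = in_rates @ weights             # summed synaptic input
        target = 1.0 / (1.0 + np.exp(-drive))  # bounded firing rate in (0, 1)
        out_rates += 0.1 * (target - out_rates)
        # Step c) one possible abortion criterion: a rate threshold is exceeded.
        if threshold is not None and out_rates.max() > threshold:
            break

    # Step d) class values are obtained from the output firing frequencies.
    return out_rates
```

An input strongly correlated with the first class then drives the first output neuron to the highest firing frequency.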
2. Method according to claim 1, wherein before step d) the firing frequency of the output neurons (24a-c) is evaluated with training data (34), whereby e) the ANN is re-trained if the firing frequency of at least one output neuron (24a-c) fails to meet validity criteria based on the training data (34), in which case the firing frequency at least of the failing output neuron(s) (24a-c) is set to class values from the training data (34), the transmission function of the incoming synapses (26) to the failing output neuron(s) (24a-c) is altered in such a way as to enhance a positive coupling of the input neuron(s) (20a-d) and the failing output neuron(s) (24a-c) and to inhibit a negative coupling thereof, and the method returns to simulation step b).
3. Method according to claim 2, wherein the coupling between an input neuron (20a-d) and an output neuron (24a-c) is positive if their corresponding firing frequencies are both > 0.5, and wherein the coupling is negative if the firing frequency of the input neuron (20a-d) or of the output neuron (24a-c) is smaller than 0.5.
4. Method according to claim 2 or 3, wherein the re-training is done via Hebbian learning.
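The Hebbian re-training of claims 2 to 4 can be sketched as a per-synapse weight update. The rule below is a common Hebbian formulation adapted to the coupling definition of claim 3; the exact learning rule, learning rate, and function name are assumptions, not taken from the patent.

```python
def hebbian_update(weight, input_rate, output_rate, lr=0.05):
    """One Hebbian re-training step for a single synapse (claims 2-4).

    Rates are assumed normalized to [0, 1]. The weight is strengthened
    when input and output are co-active above 0.5 (positive coupling per
    claim 3) and weakened otherwise (negative coupling is inhibited).
    """
    pre = input_rate - 0.5    # centered pre-synaptic activity
    post = output_rate - 0.5  # centered post-synaptic activity
    if pre > 0 and post > 0:
        delta = lr * pre * post             # enhance positive coupling
    else:
        delta = -lr * abs(pre) * abs(post)  # inhibit negative coupling
    return weight + delta
```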
5. Method according to one of the preceding claims, wherein the abortion criterion is the exceeding of a defined distance between the firing frequency of the two output neurons (24a-c) exhibiting the highest firing frequency.
6. Method according to one of the preceding claims, wherein the abortion criterion is the exceeding of a defined firing frequency threshold value by at least one output neuron (24a-c).
7. Method according to one of the preceding claims, wherein the abortion criterion is the convergence of the firing frequency of the output neurons (24a-c).
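The three abortion criteria of claims 5 to 7 can be combined into a single check run after each simulation step. The thresholds (`gap`, `threshold`, `eps`) and the function name are illustrative assumptions; the claims only define the criteria, not their values.

```python
def should_abort(out_rates, prev_rates, gap=0.3, threshold=0.9, eps=1e-4):
    """Abortion criteria of claims 5-7 (threshold values are assumed).

    out_rates  : current output firing frequencies
    prev_rates : firing frequencies from the previous simulation step
    """
    ordered = sorted(out_rates, reverse=True)
    # Claim 5: the distance between the two highest-firing output
    # neurons exceeds a defined gap.
    if len(ordered) >= 2 and ordered[0] - ordered[1] > gap:
        return True
    # Claim 6: at least one output neuron exceeds a defined
    # firing-frequency threshold value.
    if ordered[0] > threshold:
        return True
    # Claim 7: the output firing frequencies have converged
    # (change from the previous step below eps).
    if max(abs(a - b) for a, b in zip(out_rates, prev_rates)) < eps:
        return True
    return False
```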
8. Method according to one of the preceding claims, wherein for the parameter values normalized values are used, whereby the firing frequency of the input neuron (20a-d) is set according to the normalized parameter value between a defined minimum and a maximum AP rate.
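The normalization of claim 8 maps a raw parameter value linearly onto a firing rate between a defined minimum and maximum AP rate. The rate bounds below (e.g. 1-100, say in Hz) and the clamping of out-of-range values are assumptions of this sketch.

```python
def rate_from_parameter(value, vmin, vmax, rate_min=1.0, rate_max=100.0):
    """Claim 8: set an input neuron's firing rate from a normalized
    parameter value, between a defined minimum and maximum AP rate.
    """
    norm = (value - vmin) / (vmax - vmin)  # normalize to [0, 1]
    norm = min(max(norm, 0.0), 1.0)        # clamp out-of-range values
    return rate_min + norm * (rate_max - rate_min)
```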
9. Method according to one of the preceding claims, wherein a one-shot learning mechanism is used in the initialization step to set the transmission function (coupling coefficients) of all synapses (26) according to the correlation coefficients from the initialization data base (31).
10. Method according to one of the preceding claims, wherein for the coupling coefficients normalized values are used, and the transmission function of the synapses (26) is set accordingly between a defined maximum inhibitory effect and a defined maximum excitatory effect, with a neutral effect in between, preferably in the middle thereof.
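With a neutral point in the middle of the normalized range, the mapping of claim 10 is a linear rescaling. The function name and the choice of [-1, +1] as the inhibitory-to-excitatory range are assumptions of this sketch.

```python
def transmission_from_coefficient(coeff):
    """Claim 10: map a normalized coupling coefficient in [0, 1] onto a
    transmission function between maximum inhibitory (-1) and maximum
    excitatory (+1) effect, with the neutral point in the middle (0.5).
    """
    return 2.0 * coeff - 1.0  # 0 -> -1 (inhibitory), 0.5 -> 0 (neutral), 1 -> +1
```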
11. An analysis system (10) for the machine-aided optimization of the classification of parameter data, which parameter data comprise several parameters and corresponding parameter values, as well as classes to which the parameters are to be classified and to which they are linked via coupling coefficients, which analysis system comprises a data input (12) for transferring the parameter data into the analysis system, the analysis system (10) comprising a processing logic (14) with an artificial neuronal network (ANN) (16) having an input layer (18) with several input neurons (20a-d) as well as an output layer (22) with several output neurons (24a-c), whereby the number of input neurons (20a-d) is selectable according to the number of parameters and the number of output neurons (24a-c) is selectable according to the number of classes, whereby the activity (AP rate) of each neuron (20a-d, 24a-c) is defined by the frequency of a pulse signal generated by it, whereby the input and output neurons (20a-d, 24a-c) are connected via synapses (26) allowing a signal transfer in the direction of the output neurons (24a-c),
each of the synapses (26) comprising a transmission function which is variable between an inhibitory and an excitatory coupling effect, whereby the processing logic (14) is configured to set the AP rate of the input neurons (20a-d) dependent on the parameter values from the data input (12), and whereby the frequency of the pulse signals of each output neuron (24a-c) depends on the sum of signals obtained from all of its incoming synapses (26), characterized in that the analysis system (10) comprises an initialization database (31) comprising the correlation coefficients between the parameters and the classes, and a) which processing logic (14) is configured to set the transmission function (coupling coefficient) of the synapses (26) in an initialization step according to the correlation coefficients from the initialization database (31), b) the processing logic (14) is configured to run the ANN (16) in a simulation step, whereby the firing frequency of each output neuron (24a-c) adjusts according to the number and transmission functions of its incoming synapses (26), c) the processing logic (14) is configured to terminate the simulation step b) after a set time or after an abortion criterion is fulfilled, and d) the processing logic (14) is configured to calculate and output the classification values from the firing frequency of the output neurons (24a-c).
12. The analysis system of claim 11, wherein the processing logic (14) comprises an evaluation unit (32) and a training data base (34) comprising class values of the classes, which evaluation unit (32) is configured to evaluate the validity of the firing frequency of the output neurons (24a-c) by comparison with class values from the training data base (34), whereby the processing logic (14) is configured to set the firing frequency at least of the output neuron(s) (24a-c) failing the validity evaluation to class values from the training data base (34), whereby the transmission function of the incoming synapses (26) to the failing output neuron(s) (24a-c) is altered in such a way as to enhance a positive coupling of the input neuron(s) (20a-d) and the failing output neuron(s) (24a-c) and to inhibit a negative coupling thereof, and whereby the processing logic (14) is configured to start the simulation step b) with the altered transmission functions.
13. The analysis system of claim 12, wherein the processing logic (14) is configured to alter the transmission functions according to Hebbian learning.
14. The analysis system of claim 11, 12 or 13, wherein the abortion criterion is the exceeding of a defined distance between the firing frequency of the two output neurons (24a-c) exhibiting the highest firing frequency and/or the exceeding of a defined firing frequency threshold value by at least one output neuron (24a-c) and/or the convergence of the firing frequency of the output neurons (24a-c).
15. The analysis system of one of claims 11 to 14, wherein each neuron (20) comprises an axon (28) having an axon terminal (30) towards the output neuron (24a-c), which axon terminal (30) comprises the transmission function of the synapse (26).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2021/050465 WO2022152364A1 (en) | 2021-01-12 | 2021-01-12 | Method and analysis system for the classification of parameters |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022152364A1 true WO2022152364A1 (en) | 2022-07-21 |
Family
ID=74186688
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022152364A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190197390A1 (en) * | 2017-12-22 | 2019-06-27 | International Business Machines Corporation | Approaching homeostasis in a binary neural network |
WO2020210673A1 (en) * | 2019-04-10 | 2020-10-15 | Cornell University | Neuromorphic algorithm for rapid online learning and signal restoration |
Non-Patent Citations (2)
Title |
---|
DOO SEOK JEONG: "Tutorial: Neuromorphic spiking neural networks for temporal learning", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 11 September 2018 (2018-09-11), XP081087228, DOI: 10.1063/1.5042243 * |
TAN CLARENCE ET AL: "Spiking Neural Networks: Background, Recent Development and the NeuCube Architecture", NEURAL PROCESSING LETTERS, KLUWER ACADEMIC PUBLISHERS, NORWELL, MA, US, vol. 52, no. 2, 13 August 2020 (2020-08-13), pages 1675 - 1701, XP037257624, ISSN: 1370-4621, [retrieved on 20200813], DOI: 10.1007/S11063-020-10322-8 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21700541 | Country of ref document: EP | Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 21700541 | Country of ref document: EP | Kind code of ref document: A1 |