CN113673688A - Weight generation method, data processing method and device, electronic device and medium

Info

Publication number: CN113673688A
Application number: CN202110976731.XA
Authority: CN (China)
Prior art keywords: neuron, weight, connection, processing core, core
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 吴臻志, 张启坤, 张静
Current Assignee: Beijing Lynxi Technology Co Ltd
Original Assignee: Beijing Lynxi Technology Co Ltd
Application filed by Beijing Lynxi Technology Co Ltd
Priority to CN202110976731.XA
Publication of CN113673688A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/061: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using biological neurons, e.g. biological neurons connected to an integrated circuit
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 7/00: Methods or arrangements for processing data by operating upon the order or content of the data handled
    • G06F 7/58: Random or pseudo-random number generators
    • G06F 7/588: Random number generators, i.e. based on natural stochastic processes

Abstract

The present disclosure provides a weight generation method applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores and each processing core includes a plurality of neurons. The weight generation method includes: generating, by using a preset weight generator, at least part of the connection weights corresponding to a first neuron in a target processing core, where each connection weight is a connection weight between the first neuron and a second neuron and is used to perform a membrane potential integration operation of the corresponding first neuron or second neuron. The present disclosure also provides a data processing method, a data processing apparatus, a processing core, a many-core system, an electronic device, and a computer-readable medium.

Description

Weight generation method, data processing method and device, electronic device and medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a weight generation method, a data processing method and apparatus, a processing core, a many-core system, an electronic device, and a computer-readable medium.
Background
In the related art, a large number of connection weights between neurons in a neuromorphic chip of a many-core architecture are usually stored on the chip, and a large storage space is occupied.
Therefore, how to effectively reduce the occupation of on-chip storage resources of a neuromorphic chip with a many-core architecture becomes a technical problem to be solved urgently at present.
Disclosure of Invention
The disclosure provides a weight generation method, a data processing method and device, a processing core, a many-core system, an electronic device and a computer readable medium.
According to a first aspect of the present disclosure, an embodiment of the present disclosure provides a weight generation method applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores and each processing core includes a plurality of neurons, and the weight generation method includes:
generating at least part of connection weights corresponding to a first neuron in the target processing core by using a preset weight generator;
wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
In some embodiments, before the generating, by the preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core, the method further includes: receiving a weight generation instruction of the target processing core;
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
and generating at least part of connection weight corresponding to the first neuron by using a preset weight generator according to the weight generation instruction.
In some embodiments, the processing core comprises an instruction storage space comprising weight generation instructions comprising instruction execution time beats,
wherein the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
when the instruction execution time beat starts, the weight generation instruction is obtained and executed from the instruction storage space, and at least part of connection weights corresponding to the first neuron are generated by using a preset weight generator.
In some embodiments, the method further comprises: sending at least a portion of the connection weights corresponding to the first neuron to the target processing core.
In some embodiments, before the generating, by the preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core, the method further includes:
determining whether all target connection weights corresponding to the first neuron are included in a target storage space;
if not, determining a target connection weight which is not included in the target storage space as at least part of the connection weight corresponding to the first neuron to be generated, wherein the target storage space comprises an on-chip storage space and/or an off-chip memory.
In some embodiments, the method further comprises:
reading the stored target connection weight from the target storage space;
and determining all target connection weights corresponding to the first neuron according to at least part of the corresponding connection weights of the first neuron generated by a preset weight generator and the stored target connection weights read from a target storage space.
In some embodiments, the method further comprises:
applying for a storage space for storing at least part of the connection weights corresponding to the first neuron in a target storage space;
and if the application is successful, storing at least part of the connection weight corresponding to the first neuron in the target storage space.
In some embodiments, the weight generator comprises a first random number generator; the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
sending first control information to the first random number generator, wherein the first control information contains a connection matrix corresponding to a first neuron in a target processing core, and the connection matrix comprises information for representing whether connection weights need to be generated between the first neuron and each second neuron;
the first random number generator is configured to generate, in response to the first control information, at least part of connection weights corresponding to the first neuron and the second neuron, which need to generate connection weights, according to a first random seed and the connection matrix.
In some embodiments, the weight generator comprises a first random number generator;
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
generating a connection matrix corresponding to a first neuron in a target processing core, wherein the connection matrix comprises information for representing whether connection weight generation is needed between the first neuron and each second neuron;
and generating at least part of corresponding connection weights between the first neuron and the second neuron needing to generate the connection weights according to the first random seed and the connection matrix by utilizing the first random number generator.
In some embodiments, the weight generator further comprises a second random number generator connected to the first random number generator;
the generating a connection matrix corresponding to a first neuron in a target processing core comprises:
generating, with the second random number generator, the connection matrix corresponding to a first neuron in a target processing core.
In some embodiments, the generating, with the second random number generator, the connection matrix corresponding to the first neuron in the target processing core comprises:
sending second control information to the second random number generator, the second control information including weight sparsity information corresponding to a first neuron in a target processing core, to cause the second random number generator to perform:
responding to the second control information, and generating the connection matrix with sparsity matched with the weight sparsity information according to a second random seed;
sending the connection matrix corresponding to a first neuron in a target processing core to the first random number generator.
In some embodiments, the generating the connection matrix corresponding to the first neuron in the target processing core comprises:
randomly selecting one second neuron from a plurality of second neurons connected with the first neuron, and determining it as a second neuron for which a connection weight with the first neuron needs to be generated;
under the condition that the number of the selected second neurons is smaller than the target number, continuing to randomly select the next second neuron and determining it as a second neuron for which a connection weight with the first neuron needs to be generated;
and under the condition that the number of the selected second neurons is equal to the target number, generating the connection matrix corresponding to the first neuron in the target processing core according to all the selected second neurons needing to generate connection weights with the first neuron.
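For illustration only, the following sketch restates the selection procedure above: second neurons are randomly picked until the target number is reached, and the selected neurons determine the 0/1 connection row of the first neuron. The function name, the use of Python's random module and the string identifiers are assumptions introduced here, not part of the disclosure.

```python
import random

def select_connected_neurons(candidate_second_neurons, target_count, seed=None):
    """Randomly pick second neurons (without repetition) until the target number is reached,
    then build a 0/1 connection row for the first neuron."""
    rng = random.Random(seed)
    remaining = list(candidate_second_neurons)
    selected = set()
    while len(selected) < target_count and remaining:
        neuron = rng.choice(remaining)   # randomly select the next second neuron
        remaining.remove(neuron)
        selected.add(neuron)             # a connection weight must be generated for this pair
    # connection row: 1 where a weight must be generated, 0 otherwise
    return [1 if n in selected else 0 for n in candidate_second_neurons]

print(select_connected_neurons(["A1", "A2", "A3", "A4"], target_count=3, seed=7))
```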
In some embodiments, the connection matrix is a matrix of 0 elements and 1 elements, each element corresponding to one first neuron of the target processing core and one second neuron, and each column of elements corresponding to one first neuron of the target processing core;
each 0 element is used to characterize that a connection weight does not need to be generated between the corresponding first neuron and second neuron, and each 1 element is used to characterize that a connection weight needs to be generated between the corresponding first neuron and second neuron.
In some embodiments, the first random number generator comprises a uniform number generator and a distribution transformer connected to the uniform number generator, the distribution transformer having a preset distribution function and corresponding distribution parameters configured therein;
the uniform number generator is used for generating uniform random numbers corresponding to the first neuron and the second neuron which need to generate the connection weight according to the first random seed and the connection matrix, and the uniform random numbers meet a uniform distribution rule;
the distribution transformer is used for transforming the uniform random number into the connection weight meeting a preset distribution function according to the distribution parameters and the preset distribution function.
In some embodiments, the preset distribution function comprises a gaussian distribution function or a poisson distribution function.
In some embodiments, the connection weight corresponding to the first neuron in the target processing core satisfies a preset distribution rule.
In some embodiments, the preset distribution rule comprises a gaussian distribution function or a poisson distribution function.
In some embodiments, in a case where a plurality of the weight generators are configured, for a plurality of the first neurons, the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neurons in the target processing core includes:
generating at least part of the connection weights corresponding to a plurality of the first neurons in parallel by using a plurality of weight generators, wherein each weight generator is used for generating at least part of the weights corresponding to one first neuron.
In some embodiments, the first neuron comprises a currently firing neuron and/or a neuron having a connection weight with a currently firing second neuron.
According to a second aspect of the present disclosure, an embodiment of the present disclosure provides a weight generation method applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores, and the processing core includes a plurality of neurons, and the method includes:
receiving at least a portion of the connection weights corresponding to the first neuron sent by the at least one generating core,
wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
According to a third aspect of the present disclosure, an embodiment of the present disclosure provides a data processing method, including: in response to receiving an input pulse, acquiring a connection weight corresponding to a first neuron corresponding to the input pulse in a target processing core by using the weight generation method; the connection weight is a connection weight between the first neuron and the second neuron;
and performing membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
According to a fourth aspect of the present disclosure, an embodiment of the present disclosure provides a data processing method, including: under the condition that a first neuron of a target processing core meets a firing condition, acquiring a connection weight between the first neuron and a second neuron by using the weight generation method;
and sending input information to the second neuron, wherein the input information comprises an input pulse and a connection weight between the first neuron and the second neuron, so that the second neuron can perform membrane potential integration operation according to the input pulse and the connection weight.
According to a fifth aspect of the present disclosure, an embodiment of the present disclosure provides a weight generation apparatus applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores, the processing core includes a plurality of neurons, and the weight generation apparatus includes: a control module and a weight generator;
the control module configured to control the weight generator to generate at least part of the connection weights corresponding to the first neuron in the target processing core;
wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
According to a sixth aspect of the present disclosure, an embodiment of the present disclosure provides a weight generation apparatus applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores and the processing core includes a plurality of neurons, the weight generation apparatus including:
a receiving unit configured to receive at least part of the connection weights corresponding to the first neuron transmitted by the at least one generating core,
wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
According to a seventh aspect of the present disclosure, an embodiment of the present disclosure provides a data processing apparatus, including: a weight acquisition unit configured to acquire, in response to receiving an input pulse, a connection weight corresponding to a first neuron corresponding to the input pulse in a target processing core by the weight generation device as described above; the connection weight is a connection weight between the first neuron and the second neuron;
an integration issuing unit configured to perform a membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
According to an eighth aspect of the present disclosure, an embodiment of the present disclosure provides a data processing apparatus, including: a weight acquisition unit configured to acquire, in a case where a first neuron of a target processing core satisfies a firing condition, a connection weight between the first neuron and a second neuron using the weight generation device as described above;
an integration issuance unit configured to send input information to the second neuron, the input information including an input pulse and a connection weight between the first neuron and the second neuron, for the second neuron to perform a membrane potential integration operation according to the input pulse and the connection weight.
According to a ninth aspect of the present disclosure, an embodiment of the present disclosure provides a processing core, which includes the weight generation apparatus described above and/or the data processing apparatus described above.
According to a tenth aspect of the present disclosure, an embodiment of the present disclosure provides a many-core system, which includes a plurality of processing cores, at least one of which is the processing core described above.
According to an eleventh aspect of the present disclosure, an embodiment of the present disclosure provides an electronic device, including: a plurality of processing cores; and a network on chip configured to exchange data among the plurality of processing cores and to exchange external data; where one or more instructions are stored in one or more of the processing cores, and the one or more instructions are executed by the one or more processing cores to enable the one or more processing cores to execute the weight generation method or the data processing method described above.
According to a twelfth aspect of the present disclosure, an embodiment of the present disclosure provides a computer readable medium, on which a computer program is stored, wherein the computer program, when executed by a processing core, implements the weight generation method or the data processing method described above.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. The above and other features and advantages will become more apparent to those skilled in the art by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which:
FIG. 1 is a schematic diagram of a many-core system according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a processing core of FIG. 1;
fig. 3 is a flowchart of a weight generation method provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a connection relationship between a first neuron of a target processing core B and a second neuron of a processing core A;
FIG. 5 is a schematic diagram of a weight generator according to an embodiment of the disclosure;
FIG. 6 is a block diagram of a first random number generator according to an embodiment of the present disclosure;
fig. 7 is a flowchart of a method for generating weights according to an embodiment of the present disclosure;
fig. 8 is a flowchart of a data processing method provided by an embodiment of the present disclosure;
FIG. 9 is a flow chart of another data processing method provided by the embodiments of the present disclosure;
fig. 10 is a block diagram illustrating a weight generation apparatus according to an embodiment of the present disclosure;
fig. 11 is a block diagram of a data processing apparatus according to an embodiment of the present disclosure;
FIG. 12 is a block diagram illustrating another data processing apparatus according to an embodiment of the present disclosure;
fig. 13 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
To facilitate a better understanding of the technical aspects of the present disclosure, exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, wherein various details of the embodiments of the present disclosure are included to facilitate an understanding, and they should be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Brain-like computing constructs an electronic brain similar to a biological brain by simulating the operating mechanism of the biological brain, and is used for data processing with higher accuracy and processing efficiency. Architectures based on the von Neumann architecture provide limited support for brain-like computing, which limits the range of applications of neural networks and results in low energy efficiency. Therefore, in order to obtain new architectures that more closely match neural computation, neuromorphic chips (e.g., brain-like computing chips) have emerged, and such chips typically adopt a many-core structure.
Fig. 1 is a schematic structural diagram of a many-core system according to an embodiment of the present disclosure, and fig. 2 is a schematic structural diagram of a processing core in fig. 1, where the many-core system includes a plurality of processing cores, each processing core generally includes a group (a plurality) of neurons, a group of axons, a group of dendrites, and a synapse array, and may simulate a behavior of a biological neuron cluster, and each processing core in the many-core system may complete a dendrite integral calculation process and a soma calculation process of a corresponding group of neurons. The dendrite is a structure for receiving external stimulation and generating input current, the axon is a structure for transmitting a pulse to synapses of a subsequent neuron, the synapses are structures for connecting the neurons, and the synapse array is an array consisting of synapses and connects a group of axons and a group of dendrites.
For each neuron within a processing core, the dendrite integration computation process, also referred to in the embodiments of the present disclosure as the membrane potential integration operation, describes the process of integrating the pulse data of all input axons connected to the dendrites of that neuron. The soma operation process is responsible for updating the membrane potential of the neuron and determining whether a pulse is fired; the soma operation process is also called the firing operation in the embodiments of the present disclosure. If the membrane potential of a neuron satisfies the firing condition, the axon of the neuron fires a pulse represented by 1 to the subsequent neurons connected to it; otherwise, the axon of the neuron fires a pulse represented by 0 to the subsequent neurons connected to it. This allows many-core systems, as very large scale integrated systems with electronic analog circuits, to more accurately simulate the neurobiological structures of the nervous system.
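For illustration only, the following sketch restates the membrane potential integration operation and the firing operation using a simple leaky integrate-and-fire style model in Python; the function names, threshold, reset value and leak factor are assumptions introduced here and are not specified by the disclosure.

```python
import numpy as np

def membrane_potential_integration(v, input_spikes, weights):
    """Dendrite integration: accumulate the weighted input pulses into the membrane potential."""
    # input_spikes: 0/1 vector of pulses from connected predecessor neurons
    # weights: connection weights between those predecessors and this neuron
    return v + np.dot(weights, input_spikes)

def soma_operation(v, threshold=1.0, v_reset=0.0, leak=0.9):
    """Soma operation: decide whether to fire a pulse and update the membrane potential."""
    if v >= threshold:          # firing condition satisfied
        return v_reset, 1       # fire a pulse represented by 1
    return v * leak, 0          # otherwise fire 0 and let the potential leak

# Example: one neuron receiving pulses from four predecessor neurons
v = 0.0
spikes = np.array([1, 0, 1, 1])
weights = np.array([0.4, 0.2, 0.3, 0.5])
v = membrane_potential_integration(v, spikes, weights)
v, out_pulse = soma_operation(v)
print(v, out_pulse)   # potential exceeded the threshold, so it is reset and a 1-pulse is fired
```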
In the related art, a synapse array is generally used to simulate a connection relationship between a group of neurons of one processing core and a group of neurons of another processing core, as an array storage structure for storing a connection topology and a connection weight (also referred to as a synapse weight) between the group of neurons of one processing core and the group of neurons of another processing core. However, the weight matrix correspondingly stored in the synapse array usually contains a large number of connection weights between neurons, and occupies a large storage space of the processing core. In some application scenarios, weight parameters need to be transmitted between neurons for operation in an operation process, and more hardware processing resources are occupied, so that the improvement of the hardware processing performance of a processing core is limited to a certain extent, that is, the improvement of the performance of a many-core system is limited.
In order to effectively solve the above technical problems in the related art, the embodiments of the present disclosure provide a weight generation method, a data processing method and apparatus, a processing core, a many-core system, an electronic device, and a computer-readable medium, in which connection weights are generated by a preset weight generator rather than being entirely pre-stored on chip, thereby reducing the occupation of on-chip storage resources. Meanwhile, in some application scenarios, weight parameters do not need to be transmitted among neurons during operation; instead, the corresponding weight parameters are generated along with the operation process of the neurons, which effectively reduces the occupation of hardware processing resources, improves the hardware processing performance of the processing core to a certain extent, and thus improves the performance of the many-core system.
The technical solutions of the embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 3 is a flowchart of a weight generation method provided by an embodiment of the present disclosure, and referring to fig. 3, an embodiment of the present disclosure provides a weight generation method, where the method is applied to a processing core of a many-core system, the method may be performed by a weight generation apparatus, the apparatus may be implemented in a software and/or hardware manner, the apparatus may be integrated in the processing core of the many-core system, and the weight generation method includes:
step S1, generating at least a part of connection weights corresponding to the first neuron in the target processing core by using a preset weight generator.
Wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
The target processing core may be any processing core in a many-core system, for example, may be a current processing core (or referred to as a generating core) that executes a weight generation method, and the current processing core may generate at least part of the connection weights corresponding to the first neurons included in the current processing core by using the weight generator. The target processing core may also be any processing core other than the current processing core, and the current processing core may generate at least part of the connection weights corresponding to the first neuron included in the other processing core by using the weight generator, which is not limited by the present disclosure.
The second neuron may be a neuron connected to the first neuron in any processing core, and may be in the same processing core as the first neuron or in a different processing core from the first neuron, which is not limited by the present disclosure. The connection relationship between the first neuron and the second neuron may be preset; for example, a connection matrix including information indicating whether connection weights need to be generated between the first neuron and each second neuron may be preset, and the connection matrix may be stored in the generating core or received by the generating core. The connection relationship between the first neuron and the second neuron may also be generated online, for example, according to a random number generator and a random seed, which is not limited by the present disclosure.
In the embodiments of the present disclosure, the first neuron may be a predecessor neuron or a successor neuron of the second neuron, which is not limited by the present disclosure. The first neuron being a predecessor of the second neuron can be understood as follows: when the first neuron fires, the firing pulse is transmitted to the second neuron. The first neuron being a successor of the second neuron can be understood as follows: the first neuron receives the firing pulse delivered by the firing second neuron.
In the embodiment of the present disclosure, each first neuron may correspond to at least one second neuron, and each second neuron may also correspond to at least one first neuron, and the target processing core may include at least one first neuron, and the present disclosure does not limit the number of first neurons, the number of second neurons, and the corresponding relationship between the first neurons and the second neurons.
The first neuron may be a neuron that needs to perform a membrane potential integration operation, a neuron whose connected second neuron needs to perform a membrane potential integration operation, or any first neuron in the processing core. It should be understood that, for any processing core, determining all connection weights of each first neuron included therein determines the synapse array of that processing core.
In some embodiments, the first neuron comprises a currently firing neuron and/or a neuron having a connection weight with a currently firing second neuron.
In the embodiments of the present disclosure, when applied to a sparse computation scenario, connection weights may be generated only for first neurons that are currently firing and/or first neurons having connection weights with currently firing second neurons. This meets the requirement of the membrane potential integration operation without generating the connection weights of all neurons in the processing core, which reduces the amount of weight data to be generated and improves weight generation efficiency.
The at least part of the connection weight corresponding to the first neuron may be at least part of the connection weight required by the current membrane potential integration operation, or may be at least part of all the connection weights corresponding to the first neuron, which is not limited in this disclosure.
The connection weight may be any value, and the disclosure is not limited thereto.
In the embodiments of the present disclosure, for a first neuron in the target processing core, the weight matrix required for its operation does not need to be pre-stored in a synapse array; instead, a preset weight generator generates the connection weights between the first neuron and the connected second neurons, which saves storage space of the processing core. In some application scenarios, when a pulse is transmitted to the first neuron of the target processing core, or when the first neuron of the target processing core transmits a pulse to the second neuron, the related weight parameters do not need to be transmitted, which effectively reduces the occupation of hardware processing resources and improves the hardware processing performance of the processing core to a certain extent, that is, improves the performance of the many-core system.
In some optional embodiments, the target processing core is any processing core except the current processing core, and the disclosure does not limit how the current processing core generates at least part of the connection weight corresponding to the first neuron in the target processing core by using the preset weight generator. For example, the current processing core may generate at least part of the connection weight corresponding to the first neuron in the target processing core by using the weight generator according to the weight generation instruction sent by the target processing core or a pre-stored weight generation instruction. The weight generation instruction is used for instructing the current processing core to generate at least part of the connection weight corresponding to the first neuron in the target processing core, and the form and content of the weight generation instruction are not limited by the disclosure.
In one possible implementation, the weight generation instruction may include, but is not limited to: the method comprises the steps of core position of a target processing core, a first random seed, a distribution function type, the number of connection weights needing to be generated, a connection matrix and instruction execution time beat.
The first random seed is used to generate connection weights, the distribution function type is used to determine the distribution function (distribution rule) that the generated connection weights need to satisfy, and the connection matrix may include information used to characterize whether connection weights need to be generated between the first neuron and each second neuron in the target processing core; that is, the target processing core may specify, through the connection matrix, which neurons need to generate connection weights and the number of connection weights that need to be generated. The instruction execution time beat represents the time beat at which the weight generation instruction is executed. When a plurality of weight generation instructions are included, the weight generation instructions can be stored in the instruction storage space in order of their instruction execution time beats to form an instruction string.
In some embodiments, the weight generation instruction may further include a memory location of the target processing core, so that the current processing core caches at least part of the generated connection weight to a memory space corresponding to the memory location after generating the connection weight corresponding to the first neuron of the target processing core.
In some embodiments, the weight generation instruction may further include weight sparsity information and a second random seed of the required generated connection weights, the weight sparsity information and the second random seed being used to generate the required connection matrix.
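By way of illustration, the fields listed above could be grouped as in the following sketch; the field names and Python types are assumptions introduced here and do not reflect the actual instruction encoding of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class WeightGenerationInstruction:
    # core position of the target processing core, e.g. its (x, y) coordinate on the network on chip
    target_core_position: Tuple[int, int]
    first_random_seed: int                           # seed used to generate the connection weights
    distribution_type: str                           # e.g. "gaussian" or "poisson"
    num_weights: int                                 # number of connection weights to generate
    connection_matrix: Optional[List[List[int]]]     # 0/1 matrix; may be omitted if generated online
    execution_time_beat: int                         # time beat at which the instruction is executed
    # optional fields mentioned in the disclosure
    target_memory_location: Optional[int] = None     # where the generated weights are cached
    weight_sparsity: Optional[float] = None          # proportion of 1 elements in the connection matrix
    second_random_seed: Optional[int] = None         # seed used to generate the connection matrix
```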
In some optional embodiments, before step S1, the weight generation method may further include:
receiving a weight generation instruction of the target processing core;
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core may include:
and generating at least part of connection weight corresponding to the first neuron by using a preset weight generator according to the weight generation instruction.
In some embodiments, when the target processing core needs to use the weight parameter, a weight generation instruction may be sent to one or more generating cores for generating the connection weight, each generating core generating at least part of the connection weight required by the target processing core in response to the weight generation instruction of the target processing core.
The time when the weight generation instruction of the target processing core is received may be an instruction execution time, or may be earlier than the instruction execution time, which is not limited in this disclosure.
In this way, at least part of the connection weight can be generated according to the weight generation instruction, and the intra-core code space and the storage space of the target processing core can be saved.
In some optional embodiments, the processing core comprises an instruction storage space comprising weight generation instructions, the weight generation instructions comprising instruction execution time beats,
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core may include:
when the instruction execution time beat starts, the weight generation instruction is obtained and executed from the instruction storage space, and at least part of connection weights corresponding to the first neuron are generated by using a preset weight generator.
The weight generation instruction in the instruction storage space of the processing core may be stored after the weight generation instruction sent by the target processing core is received in advance of the instruction execution time, or may be prestored in the instruction storage space without being sent by the target processing core, which is not limited in this disclosure.
In the embodiment of the disclosure, when the instruction execution time beat starts, the corresponding weight generation instruction is acquired from the instruction storage space and executed to generate at least part of the connection weight, so that the data interaction frequency between the processing cores can be reduced, and the system delay caused by the data interaction is reduced.
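A minimal sketch of an instruction storage space ordered by instruction execution time beat, from which instructions are fetched and executed when their time beat starts; the class and method names are assumptions, and `weight_generator` stands in for the preset weight generator.

```python
class InstructionStorage:
    """Instruction storage space holding weight generation instructions ordered by time beat."""
    def __init__(self):
        self._instructions = []

    def add(self, instruction):
        self._instructions.append(instruction)
        # keep the instruction string ordered by instruction execution time beat
        self._instructions.sort(key=lambda ins: ins.execution_time_beat)

    def due_at(self, current_beat):
        """Fetch the instructions whose execution time beat starts now."""
        return [ins for ins in self._instructions
                if ins.execution_time_beat == current_beat]

def run_time_beat(storage, current_beat, weight_generator):
    # execute every weight generation instruction whose time beat has started
    for instruction in storage.due_at(current_beat):
        weight_generator(instruction)
```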
In some embodiments, after step S1, the weight generation method may further include:
sending at least a portion of the connection weights corresponding to the first neuron to the target processing core.
Illustratively, at least a portion of the connection weights corresponding to the first neuron may be sent to the target processing core via a network on chip.
In the embodiment of the present disclosure, at least part of the connection weights corresponding to the first neurons in the target processing core other than the current processing core may be generated, and the generation of the connection weights corresponding to each first neuron in the many-core system may be ensured by including a weight generator in part of the processing cores in the many-core system, so that the intra-core code space and the storage space of part of the processing cores may be saved.
In some optional embodiments, the target processing core may store a portion of the connection weights in a target storage space, where the target storage space may include an on-chip memory space (e.g., a storage space within the processing core) and/or an off-chip memory, such as a Double Data Rate SDRAM (DDR). This portion of the connection weights can be obtained by the target processing core by reading the on-chip memory space or the off-chip memory, while another portion of the connection weights required by the target processing core (e.g., a portion that does not exist in the target storage space) may be generated online.
The connection weight may be generated in advance and stored in an on-chip memory space (for example, a memory space in the processing core) or an off-chip memory, or may be generated online and stored, which is not limited in this disclosure.
The processing core that reads the target storage space to obtain at least part of the weight parameters may be a current processing core (generating core) that executes the weight generation method, and the at least part of the weight parameters may be connection weights corresponding to the first neuron included in the processing core (generating core), or connection weights corresponding to the first neuron in other processing cores (target processing cores). It should be understood that, when the target processing core is a non-generating core, the target storage space may also be read to obtain the connection weight corresponding to the first neuron included in the target processing core, which is not limited by the present disclosure.
It should be understood that the generating of at least part of the connection weights corresponding to the first neuron in the target processing core by using the preset weight generator may be generating at least part of the connection weights corresponding to the first neuron which is not already stored in the target storage space, or generating at least part of the connection weights corresponding to the weight generation instruction in response to the weight generation instruction transmitted by the target processing core.
In some optional embodiments, before step S1, the method further comprises: determining whether all target connection weights corresponding to the first neuron are included in a target storage space;
if not, determining the target connection weight which is not included in the target storage space as at least part of the connection weight corresponding to the first neuron to be generated.
The target connection weight is a connection weight that the target processing core needs to use in the current operation process, and may be all connection weights corresponding to the first neuron or a part of connection weights corresponding to the first neuron.
In this way, at least a portion of the connection weights corresponding to the first neuron to be generated may be determined for generation by the weight generator to determine all of the target connection weights.
In some optional embodiments, the weight generation method may further include:
reading the stored target connection weight from the target storage space;
and determining all target connection weights corresponding to the first neuron according to at least part of the corresponding connection weights of the first neuron generated by a preset weight generator and the stored target connection weights read from a target storage space.
In this way, for the stored target connection weight, the target connection weight can be obtained by reading the target storage space, and for the target connection weight which is not stored, the target connection weight can be generated on line, so that the frequency of generating the weight and the amount of generated data can be reduced.
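A minimal sketch of this read-then-generate combination, assuming the target storage space is modelled as a dictionary keyed by (first neuron, second neuron) pairs and that `generate_weight` stands in for the preset weight generator; all names are illustrative.

```python
def gather_target_weights(pairs, target_storage, generate_weight):
    """Read the stored target connection weights and generate only the missing ones."""
    stored = {p: target_storage[p] for p in pairs if p in target_storage}
    missing = [p for p in pairs if p not in stored]           # weights still to be generated
    generated = {p: generate_weight(p) for p in missing}      # produced by the weight generator
    return {**stored, **generated}                            # all target connection weights

# Example with a hypothetical storage and a stand-in generator
storage = {("B1", "A1"): 0.42}
weights = gather_target_weights(
    [("B1", "A1"), ("B1", "A3"), ("B1", "A4")],
    storage,
    generate_weight=lambda pair: 0.1,
)
print(weights)
```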
In some optional embodiments, the weight generation method may further include:
applying for a storage space for storing at least part of the connection weights corresponding to the first neuron in a target storage space;
and if the application is successful, storing at least part of the connection weight corresponding to the first neuron in the target storage space.
In some optional embodiments, after generating at least a portion of the connection weights required by the target processing core, an attempt may be made to apply for storing the at least a portion of the connection weights to the on-chip memory space and/or the off-chip memory. When the remaining on-chip memory space of the target processing core is insufficient or the bandwidth of the off-chip memory is insufficient, the request to continue caching the generated at least part of the connection weights is refused.
In some embodiments, after the at least part of the connection weights is cached in the on-chip memory space or the off-chip memory of the target processing core, it may be overwritten by other temporary data stored subsequently, in which case the at least part of the connection weights is marked as invalid. When the target processing core needs to use the at least part of the connection weights for operation, it can check whether the at least part of the connection weights has been generated and is still valid. If both conditions are satisfied, it is determined that the target storage space includes the at least part of the connection weights, and the existing at least part of the connection weights can be used directly; otherwise, the at least part of the connection weights is generated online again.
In this way, at least part of the connection weight can be stored in the target storage space, and when the connection weight needs to be used, the connection weight can be directly read from the target storage space without executing the generation operation each time.
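For illustration, the following sketch models the apply, store, invalidate and lookup behaviour described in the preceding paragraphs; the simple capacity check standing in for "insufficient storage space or bandwidth" and all names are assumptions.

```python
class WeightCache:
    """Target storage space with a simple capacity limit and per-entry validity flags."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}   # key -> (weights, valid_flag)

    def apply_and_store(self, key, weights):
        """Apply for storage space; store the weights only if the application succeeds."""
        if len(self.entries) >= self.capacity:
            return False                     # application refused: no space or bandwidth left
        self.entries[key] = (weights, True)
        return True

    def invalidate(self, key):
        """Entry overwritten by later temporary data: mark it invalid."""
        if key in self.entries:
            weights, _ = self.entries[key]
            self.entries[key] = (weights, False)

    def lookup(self, key):
        """Return cached weights only if they were generated and are still valid."""
        weights, valid = self.entries.get(key, (None, False))
        return weights if valid else None
```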
In the embodiments of the present disclosure, brain simulation mainly simulates physical characteristics, which means that little attention is paid to the specific values of the weights while more attention is paid to their variation trend. Therefore, in order to make the generated connection weights conform to an expected distribution trend, the embodiments of the present disclosure set a distribution constraint condition to constrain the weight values generated by the weight generator, so that the connection weights corresponding to the first neuron generated by the weight generator satisfy a preset distribution rule. The preset distribution rule may include a preset distribution function, so that the connection weights of the neurons generated by the weight generator conform to the expected distribution requirement. In some embodiments, the preset distribution function may include a Gaussian (normal) distribution function or a Poisson distribution function.
In some embodiments, a connection matrix corresponding to a first neuron of the target processing core is set in advance according to weight sparsity information corresponding to the first neuron of the target processing core. The connection matrix includes information for characterizing whether connection weights need to be generated between the first neuron and each second neuron in the target processing core, and each element in the connection matrix characterizes whether a connection weight needs to be generated between one first neuron of the target processing core and one second neuron. The weight sparsity information is used to control the number of elements in the connection matrix that characterize first and second neurons for which connection weights need to be generated, and the number of elements that characterize first and second neurons for which connection weights do not need to be generated.
In some embodiments, the connection matrix is a matrix of 0 elements and 1 elements, each element corresponding to a first neuron and a second neuron of the target processing core, and each column element corresponding to a first neuron of the target processing core. Wherein each 0 element is used to characterize a connection weight that need not be generated between the corresponding first neuron and the second neuron, and each 1 element is used to characterize a connection weight that need to be generated between the corresponding first neuron and the second neuron. The weight sparsity information may be understood as a proportion of 1 element or a proportion of 0 element in the connection matrix.
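For illustration, the following sketch generates a 0/1 connection matrix whose proportion of 1 elements matches given weight sparsity information, using a seeded generator in the role of the second random number generator; the Bernoulli sampling (which matches the sparsity only in expectation) and all names are assumptions rather than the scheme used by the disclosure.

```python
import numpy as np

def generate_connection_matrix(num_second, num_first, one_ratio, second_seed):
    """Second random number generator: produce a 0/1 connection matrix whose sparsity
    matches the weight sparsity information (proportion of 1 elements)."""
    rng = np.random.default_rng(second_seed)
    # 1 means a connection weight must be generated for that (second neuron, first neuron) pair
    return (rng.random((num_second, num_first)) < one_ratio).astype(np.int8)

matrix = generate_connection_matrix(num_second=4, num_first=3, one_ratio=0.75, second_seed=2)
print(matrix)   # rows correspond to second neurons, columns to first neurons of the target core
```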
For ease of understanding, the following is exemplary with the first neuron being located at target processing core B and the second neuron being located at processing core a, with the understanding that the disclosure is not so limited.
Fig. 4 is a schematic diagram of a connection relationship between a first neuron of a target processing core B and a second neuron of a processing core A. As an example, the target processing core B includes a first neuron B1, a first neuron B2 and a first neuron B3, and the processing core A includes a second neuron A1, a second neuron A2, a second neuron A3 and a second neuron A4. Each first neuron in the target processing core B has a connection relationship with each second neuron of the processing core A, i.e., there is a full connection relationship between the first neurons of the target processing core B and the second neurons of the processing core A. A connection weight does not need to be generated between the first neuron B1 and the second neuron A2, while connection weights need to be generated between the first neuron B1 and the second neurons A1, A3 and A4; a connection weight does not need to be generated between the first neuron B2 and the second neuron A1, while connection weights need to be generated between the first neuron B2 and the second neurons A2, A3 and A4; connection weights do not need to be generated between the first neuron B3 and the second neurons A3 and A4, while connection weights need to be generated between the first neuron B3 and the second neurons A1 and A2. Then, for the first neuron B1 of the target processing core B, its corresponding connection weights include the connection weight a11 with the connected second neuron A1, the connection weight a13 with the connected second neuron A3, and the connection weight a14 with the connected second neuron A4; for the first neuron B2 of the target processing core B, its corresponding connection weights include the connection weight a22 with the connected second neuron A2, the connection weight a23 with the connected second neuron A3, and the connection weight a24 with the connected second neuron A4; for the first neuron B3 of the target processing core B, its corresponding connection weights include the connection weight a31 with the connected second neuron A1 and the connection weight a32 with the connected second neuron A2.
From the connection relationship shown in fig. 4, it can be seen that the connection matrix corresponding to the first neuron B1 of the target processing core B is [1, 0, 1, 1], the connection matrix corresponding to the first neuron B2 of the target processing core B is [0, 1, 1, 1], the connection matrix corresponding to the first neuron B3 of the target processing core B is [1, 1, 0, 0], the connection matrix corresponding to the target processing core B is a matrix consisting of 0 and 1 and having 4 rows and 3 columns, and each column element corresponds to one first neuron of the target processing core B.
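The Fig. 4 example can be restated in code form as the following sketch, which assembles the three per-neuron connection rows into the 4-row, 3-column matrix described above; the array layout (rows A1 to A4, columns B1 to B3) follows the text, and the variable names are illustrative.

```python
import numpy as np

# columns: B1, B2, B3; rows: A1, A2, A3, A4 (1 = a connection weight must be generated)
connection_matrix_B = np.array([
    [1, 0, 1],   # A1: connected with weights to B1 and B3
    [0, 1, 1],   # A2: connected with weights to B2 and B3
    [1, 1, 0],   # A3: connected with weights to B1 and B2
    [1, 1, 0],   # A4: connected with weights to B1 and B2
])

# each column is the connection matrix of one first neuron of the target processing core B
print(connection_matrix_B[:, 0])   # B1 -> [1 0 1 1]
print(connection_matrix_B[:, 1])   # B2 -> [0 1 1 1]
print(connection_matrix_B[:, 2])   # B3 -> [1 1 0 0]
```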
In some embodiments, connection weights may need to be generated between the first neuron of the target processing core B and each second neuron of the processing core a, and the corresponding connection matrix is a matrix of all 1 elements.
In some embodiments, to reduce the amount of computation, connection weights may need to be generated only between part of the first neurons of the target processing core B and the second neurons of the processing core A, i.e., the connection weights between the first neurons of the target processing core B and the second neurons of the processing core A have a certain sparsity. For example, as shown in fig. 4, the first neuron B1 and the second neuron A2 do not need to generate a connection weight, the first neuron B2 and the second neuron A1 do not need to generate a connection weight, and the first neuron B3 and the second neurons A3 and A4 do not need to generate connection weights. The connection weights between first neurons and second neurons that do not need to generate connection weights are disabled or set to 0, while the connection weights between first neurons and second neurons that need to generate connection weights are preserved. Accordingly, the elements of the connection matrix corresponding to neurons that do not need to generate connection weights are 0 elements, and the elements corresponding to neurons that need to generate connection weights are 1 elements.
In some embodiments, in a case where the connection matrix corresponding to the first neuron of the target processing core is preset, the weight generator may include a first random number generator, and the step S1 may further include: first control information is sent to a first random number generator.
In this embodiment, the first control information includes a connection matrix corresponding to a first neuron in the target processing core, and the connection matrix includes information for characterizing whether generation of connection weights is required between the first neuron and each second neuron in the target processing core.
In this embodiment, the first random number generator is configured to generate, in response to the first control information, the corresponding connection weights between the first neurons and the second neurons that require generation of connection weights, based on the first random seed and the connection matrix. In other words, for each pair of a first neuron and a second neuron whose corresponding element in the connection matrix is 1, the first random number generator generates the corresponding connection weight between them according to the first random seed.
In this embodiment, the first random seed may be a preset random seed that is stored in the first random number generator in advance. The first random number generator may generate a random number sequence according to the preset first random seed and multiply the random number sequence element-wise with the connection matrix (i.e., multiply elements located at the same row and column), so that elements of the random number sequence corresponding to 0 elements in the connection matrix are set to 0 while elements corresponding to 1 elements remain unchanged, thereby obtaining at least part of the connection weights between the first neurons and the second neurons that need connection weights.
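A minimal sketch of this masking step, assuming a seeded software generator stands in for the hardware first random number generator; all function and variable names are hypothetical:

```python
import numpy as np

# Connection vector of first neuron B1 over the second neurons A1..A4 (Fig. 4).
connection_vector = np.array([1, 0, 1, 1])

def masked_weights(first_random_seed: int, connection: np.ndarray) -> np.ndarray:
    rng = np.random.default_rng(first_random_seed)    # same seed -> same sequence
    random_sequence = rng.random(connection.shape)
    # Element-wise product: entries at 0 elements become 0,
    # entries at 1 elements keep their random value.
    return random_sequence * connection

weights_b1 = masked_weights(first_random_seed=42, connection=connection_vector)
```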
In this embodiment, in order to generate connection weights that meet an expected distribution requirement, a distribution constraint condition may further be preset in the first random number generator. The distribution constraint condition may include a preset distribution function and distribution parameters, and the constraint is applied when the connection weights are generated, so that the connection weights corresponding to the first neuron generated by the first random number generator satisfy the preset distribution function, i.e., a preset distribution rule. In some embodiments, the preset distribution function may include a Gaussian (normal) distribution function or a Poisson distribution function.
Given the same first random seed, the random number sequence generated by the first random number generator is the same each time.
In some embodiments, in a case where the connection matrix corresponding to the first neuron of the target processing core is not preset, the step of generating the connection weights corresponding to the first neuron in the target processing core by using the preset weight generator may further include: generating a connection matrix corresponding to the first neuron in the target processing core, and generating, by the first random number generator, at least part of the corresponding connection weights between the first neurons and the second neurons that need connection weights according to the first random seed and the connection matrix. For the description of the connection matrix and the first random number generator, refer to the related description above; details are not repeated here.
The step of generating the connection matrix and the step of generating the connection weights may be interleaved: when an element of the connection matrix is generated as 1, the operation of generating the corresponding connection weight is performed; when the element is generated as 0, no connection weight is generated. Once all elements of the connection matrix have been generated, all connection weights corresponding to the connection matrix have been generated. The present disclosure is not limited thereto.
Fig. 5 is a schematic structural diagram of a weight generator in an embodiment of the present disclosure. In some embodiments, in a case where the connection matrix corresponding to the first neuron of the target processing core is not preset, as shown in Fig. 5, the weight generator may include, as an example, a first random number generator and a second random number generator connected to the first random number generator. The second random number generator is configured to generate the connection matrix corresponding to the first neuron of the target processing core, and the first random number generator is configured to generate at least part of the corresponding connection weights between the first neurons and the second neurons that need connection weights.
In this embodiment, the step of generating the connection matrix corresponding to the first neuron in the target processing core may further include: generating, with the second random number generator, the connection matrix corresponding to the first neuron in the target processing core, so that the first random number generator generates at least part of the corresponding connection weights between the first neurons and the second neurons that need connection weights according to the first random seed and the connection matrix.
In this embodiment, the step of generating, with the second random number generator, the connection matrix corresponding to the first neuron in the target processing core may further include: sending second control information to the second random number generator.
In this embodiment, the second control information includes weight sparsity information corresponding to the first neuron in the target processing core.
In this embodiment, the second random number generator is configured to generate, in response to the second control information, a connection matrix whose sparsity matches the weight sparsity information according to the second random seed, and send the connection matrix corresponding to the first neuron in the target processing core to the first random number generator.
In this embodiment, the second random seed may be a preset random seed that is stored in the second random number generator in advance. The second random number generator may generate, according to the preset second random seed and the weight sparsity information, a connection matrix whose sparsity matches the weight sparsity information, where the sparsity of the connection matrix refers to the proportion of 1 elements (or the proportion of 0 elements). The weight sparsity information may include the weight sparsity corresponding to the first neuron in the target processing core and the connection relationships between the first neuron and the second neurons in the target processing core, where the weight sparsity is the proportion, among all connection relationships, of connection relationships that require generation of connection weights (or of connection relationships that do not require generation of connection weights).
A connection matrix whose sparsity matches the weight sparsity information means that the proportion of 1 elements in the connection matrix equals the proportion, in the weight sparsity information, of connection relationships for which connection weights need to be generated, or that the proportion of 0 elements in the connection matrix equals the proportion of connection relationships for which connection weights do not need to be generated.
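Purely for illustration, a connection matrix whose proportion of 1 elements matches a given sparsity could be derived from a seed as follows; the generator, the `density` parameter and the function name are assumptions, not the patented circuit:

```python
import numpy as np

def sparse_connection_matrix(second_seed: int, shape: tuple, density: float) -> np.ndarray:
    """Build a 0/1 connection matrix whose proportion of 1 elements matches
    `density`, i.e. the fraction of pairs that need a connection weight."""
    rng = np.random.default_rng(second_seed)      # same seed -> same matrix
    n_total = int(np.prod(shape))
    n_ones = round(density * n_total)
    flat = np.zeros(n_total, dtype=np.int8)
    flat[rng.choice(n_total, size=n_ones, replace=False)] = 1
    return flat.reshape(shape)

matrix = sparse_connection_matrix(second_seed=7, shape=(4, 3), density=0.75)
print(matrix.mean())                              # 0.75: proportion of 1 elements
```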
Given the same second random seed, the random number sequence generated by the second random number generator is the same each time.
In this embodiment, the second random seed information may include a preset random seed; it may differ from the first random seed information and may be the same as or different from the initial random seed information. The second random seed information may be preset and stored in the second random number generator, and the second random number generator may generate at least part of the connection weights between the first neurons and the second neurons that have a connection relationship according to the preset random seed and the connection matrix. In other words, for a first neuron and a second neuron whose corresponding element in the connection matrix is 1, the second random number generator generates at least part of the connection weight corresponding to that 1 element according to the second random seed information.
In this embodiment, for the description of how the second random number generator may generate at least part of the connection weights between the first neurons and the second neurons that have a connection relationship according to the preset random seed and the connection matrix, reference may be made to the description of how the weight generator generates at least part of such connection weights according to the initial random seed information and the connection matrix, which is not repeated here.
In some embodiments, the step of generating the connection matrix corresponding to the first neuron in the target processing core may further include: randomly selecting one second neuron from the plurality of second neurons connected to the first neuron and determining it as a second neuron for which a connection weight needs to be generated with the first neuron; in a case where the number of selected second neurons is smaller than a target number, continuing to randomly select the next second neuron and determining it as a second neuron for which a connection weight needs to be generated with the first neuron; and in a case where the number of selected second neurons is equal to the target number, generating the connection matrix corresponding to the first neuron in the target processing core according to all the selected second neurons for which connection weights need to be generated with the first neuron.
The number of times the second neurons are randomly selected may be greater than or equal to the target number, and all the selected second neurons for which connection weights need to be generated with the first neuron are different second neurons.
Illustratively, the initial value of a counter may be configured to be zero and its target value may be configured to be the target number. Specifically, each of the plurality of second neurons connected to the first neuron has a neuron identifier, which may be represented in numeric form, for example 1, 2, 3, 4, and so on. The neuron identifier of one second neuron is randomly generated, that second neuron is determined as a second neuron for which a connection weight needs to be generated with the first neuron, and the count of the counter is increased by 1. While the count value of the counter is smaller than the target number, the process continues to randomly generate the neuron identifier of another second neuron, determine it as a second neuron for which a connection weight needs to be generated with the first neuron, and increase the count of the counter by 1. When the count value of the counter equals the target number, the connection matrix corresponding to the first neuron in the target processing core is generated based on all the selected second neurons for which connection weights need to be generated with the first neuron; for example, matrix elements corresponding to the selected second neurons that need connection weights with the first neuron are set to 1, and matrix elements corresponding to the second neurons that do not need connection weights with the first neuron are set to 0, thereby generating the connection matrix corresponding to the first neuron in the target processing core.
It should be noted that, when the neuron identifier of the currently randomly selected second neuron duplicates the neuron identifier of a previously selected second neuron, the current identifier is ignored and the random selection of the next second neuron continues.
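A minimal sketch of this counter-based selection, assuming ordinary pseudo-random draws over the neuron identifiers; the use of Python's random module and the helper names are illustrative assumptions:

```python
import random

def select_connected(second_neuron_ids, target_number, seed=None):
    """Randomly draw second-neuron identifiers until `target_number` distinct
    neurons have been selected, ignoring draws that repeat an earlier
    selection (assumes target_number <= len(second_neuron_ids))."""
    rng = random.Random(seed)
    selected = set()
    counter = 0                                   # initial value of the counter: zero
    while counter < target_number:                # target value: the target number
        candidate = rng.choice(second_neuron_ids)
        if candidate in selected:                 # repeated identifier: ignore it
            continue
        selected.add(candidate)
        counter += 1
    return selected

def connection_vector(second_neuron_ids, selected):
    # 1 for pairs that need a connection weight, 0 otherwise.
    return [1 if nid in selected else 0 for nid in second_neuron_ids]

ids = [1, 2, 3, 4]
chosen = select_connected(ids, target_number=3, seed=0)
print(connection_vector(ids, chosen))
```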
Fig. 6 is a schematic structural diagram of a first random number generator according to an embodiment of the disclosure. In some embodiments, as shown in Fig. 6, the first random number generator may include a uniform number generator and a distribution transformer connected in series with the uniform number generator; a uniform distribution function and corresponding uniform distribution parameters are preset in the uniform number generator, and a preset distribution function and corresponding distribution parameters are configured in the distribution transformer.
The uniform number generator is configured to generate, according to the first random seed and the connection matrix, uniform random numbers for the first neurons and second neurons that have a connection relationship; the generated uniform random numbers follow a uniform distribution.
The distribution transformer is configured to transform the uniformly distributed random numbers into connection weights that conform to a preset distribution rule; specifically, according to the preset distribution function and the corresponding distribution parameters, it transforms the uniform random numbers into connection weights that satisfy the preset distribution function, i.e., the preset distribution rule. The preset distribution function may include a Gaussian distribution function or a Poisson distribution function.
In this embodiment, the uniform number generator may include at least one Linear Feedback Shift Register (LFSR).
In this embodiment, the distribution transformation performed by the distribution transformer depends on the preset distribution function, and the distribution transformer may perform the transformation by an algorithm or by a table-lookup method.
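To make the two stages concrete, the following is an illustrative software sketch only: a toy 16-bit LFSR plays the role of the uniform number generator, and a Box-Muller step plays the role of one possible distribution transformer for a Gaussian target. The tap positions, bit width and transform are assumptions for illustration, not the hardware design:

```python
import math

class LFSR16:
    """Toy 16-bit Fibonacci LFSR (taps 16, 14, 13, 11), used here only to
    illustrate a uniform number generator."""
    def __init__(self, seed: int):
        self.state = (seed & 0xFFFF) or 0xACE1    # state must be non-zero

    def next_uniform(self) -> float:
        for _ in range(16):                       # shift 16 bits per output number
            bit = ((self.state >> 0) ^ (self.state >> 2) ^
                   (self.state >> 3) ^ (self.state >> 5)) & 1
            self.state = (self.state >> 1) | (bit << 15)
        return self.state / 0xFFFF                # map to (0, 1]

def gaussian_transform(u1: float, u2: float, mean: float = 0.0, std: float = 1.0) -> float:
    """Distribution transformer: Box-Muller turns two uniform samples into one
    Gaussian sample with the configured distribution parameters."""
    u1 = max(u1, 1e-12)                           # guard against log(0)
    return mean + std * math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

lfsr = LFSR16(seed=0x1234)                        # the first random seed
weight = gaussian_transform(lfsr.next_uniform(), lfsr.next_uniform(), mean=0.0, std=0.1)
```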
In some embodiments, in a case where a plurality of weight generators are configured, the generating, by using a preset weight generator, at least part of the connection weights corresponding to a plurality of first neurons in the target processing core includes:
generating at least part of the connection weights corresponding to a plurality of the first neurons in parallel by using a plurality of weight generators, wherein each weight generator is used for generating at least part of the connection weights corresponding to one first neuron.
In this embodiment, if a membrane potential integration operation is performed on a plurality of first neurons or second neurons, each of which performs an operation using at least one corresponding connection weight, the corresponding connection weights of the plurality of first neurons may be generated in parallel by using a corresponding plurality of weight generators.
In this embodiment, for the description of each weight generator generating the connection weight corresponding to the first neuron, reference may be made to the above description, and details are not described here. It is to be appreciated that multiple weight generators can be multiplexed to generate connection weights for a greater number of first neurons.
In this way, the efficiency of generating the connection weight can be improved.
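As a software analogy of this parallelism (the many-core dispatch itself is not shown), several seeded generators can produce the weight rows of several first neurons concurrently; the seeds, thread pool and matrix layout below are assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def generate_weights(first_seed: int, connection_row: np.ndarray) -> np.ndarray:
    """Stand-in for one weight generator: reproducible, masked weights for a
    single first neuron."""
    rng = np.random.default_rng(first_seed)
    return rng.random(connection_row.shape) * connection_row

# One row per first neuron (B1, B2, B3) over the second neurons A1..A4,
# i.e. the transpose of the column-per-first-neuron layout described earlier.
connection_rows = np.array([[1, 0, 1, 1],
                            [0, 1, 1, 1],
                            [1, 1, 0, 0]])
seeds = [101, 102, 103]            # one (hypothetical) generator seed per first neuron

with ThreadPoolExecutor(max_workers=len(seeds)) as pool:
    all_weights = list(pool.map(generate_weights, seeds, connection_rows))
```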
Fig. 7 is a flowchart of a weight generation method provided by an embodiment of the present disclosure. Referring to Fig. 7, an embodiment of the present disclosure provides a weight generation method applied to a processing core of a many-core system. The method may be performed by a weight generation apparatus, which may be implemented in software and/or hardware and may be integrated in the processing core of the many-core system. The weight generation method includes:
Step S11: receiving at least part of the connection weights corresponding to the first neuron sent by at least one generating core.
Wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
The generating core may be the processing core described above, which may generate at least part of the connection weights corresponding to the first neuron by using a preset weight generator, and the number of the generating cores may be one or more.
For example, when the processing core needs a large number of weight parameters in a short time, at least part of the connection weights corresponding to the first neuron generated by the multiple generating cores may be received respectively to quickly determine the target connection weight corresponding to the first neuron.
As mentioned above, each generating core may generate and transmit at least a portion of the connection weights corresponding to the first neuron in response to the received weight generating instruction or according to a pre-stored weight generating instruction, which is not described herein again.
In some optional embodiments, the method further comprises:
determining whether all target connection weights corresponding to the first neuron are included in a target storage space;
if not, determining a weight generation instruction according to the target connection weight which is not included in the target storage space;
sending the weight generation instruction to at least one of the generating cores, wherein the target storage space comprises an on-chip storage space and/or an off-chip memory.
The weight generation instruction is used to generate the target connection weights corresponding to the first neuron that are not included in the target storage space. There may be multiple weight generation instructions, each corresponding to one generating core, and the weight generation instructions may be sent to the corresponding generating cores through the network on chip. The present disclosure does not limit which target connection weights corresponding to the first neuron are absent from the target storage space, as long as the generating core can generate them according to the weight generation instruction.
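For illustration only, the check-and-request flow could look like the sketch below; the weight identifiers, the instruction format and the round-robin split across generating cores are all hypothetical:

```python
def missing_weight_requests(required_ids, target_storage, generating_cores):
    """Find the target connection weights that are not yet in the target
    storage space and build one hypothetical weight-generation instruction
    per generating core."""
    missing = [wid for wid in required_ids if wid not in target_storage]
    if not missing:
        return []                                   # everything is already stored
    instructions = []
    for i, core in enumerate(generating_cores):
        share = missing[i::len(generating_cores)]   # round-robin split of the work
        if share:
            instructions.append({"core": core, "weight_ids": share})
    return instructions

target_storage = {"w_B1_A1": 0.12}                  # stand-in for on-chip/off-chip storage
requests = missing_weight_requests(["w_B1_A1", "w_B1_A3", "w_B1_A4"],
                                    target_storage, ["core0", "core1"])
print(requests)
```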
Fig. 8 is a flowchart of a data processing method provided by an embodiment of the present disclosure. Referring to Fig. 8, an embodiment of the present disclosure provides a data processing method applied to a processing core of a many-core system. The method may be performed by a data processing apparatus, which may be implemented in software and/or hardware and may be integrated in the processing core of the many-core system. The data processing method includes:
Step S21: in response to receiving an input pulse, acquiring, by using the weight generation method described above, the connection weight corresponding to the first neuron in the target processing core that corresponds to the input pulse; the connection weight is a connection weight between the first neuron and the second neuron.
The second neuron is a precursor neuron of the first neuron, and the input pulse is a firing pulse sent by the second neuron to the first neuron in the target processing core.
In some optional embodiments, the second neuron is located in a preceding processing core of the target processing core, and may receive a routing packet of the preceding processing core, where the routing packet may include, but is not limited to: an input pulse, an identification of a second neuron that issued the input pulse, an identification of a destination processing core (i.e., a target processing core), an identification of a destination neuron (i.e., a first neuron). The connection weight corresponding to the first neuron is a connection weight between the first neuron and a second neuron in the preceding processing core.
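By way of illustration, the routing packet fields listed above might be modelled as a simple record; this layout is an assumption, since the patent does not fix a packet format:

```python
from dataclasses import dataclass

@dataclass
class RoutingPacket:
    """Hypothetical layout of the routing packet fields listed above."""
    input_pulse: int         # the fired pulse (e.g. 1 for a spike)
    source_neuron_id: int    # identification of the second neuron that issued the pulse
    dest_core_id: int        # identification of the destination (target) processing core
    dest_neuron_id: int      # identification of the destination (first) neuron

packet = RoutingPacket(input_pulse=1, source_neuron_id=2, dest_core_id=5, dest_neuron_id=1)
```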
In some optional embodiments, the second neuron is located in the target processing core, and the second neuron that fires pulses to the first neuron, as well as the connection weights between the first neuron and the second neuron, may be determined by the target processing core.
Step S22: performing a membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
Specifically, the step of performing the membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight may include: weighting and summing all input pulses connected to the first neuron according to the corresponding connection weights to obtain the integrated potential of the first neuron. After this step, the method further includes: adding the integrated potential of the first neuron to the historical membrane potential to update the membrane potential of the first neuron; in a case where the updated membrane potential exceeds a preset membrane potential threshold, determining that the first neuron satisfies the firing condition, i.e., a pulse needs to be fired; otherwise, determining that the first neuron does not satisfy the firing condition, i.e., no pulse needs to be fired.
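A compact numerical sketch of this integrate-and-fire update (illustrative only; the variable names and the strict greater-than comparison with the threshold are assumptions):

```python
def integrate_and_check(input_pulses, connection_weights, membrane_potential, threshold):
    """Weight and sum all input pulses of the first neuron, add the result to
    the historical membrane potential, and compare against the threshold."""
    integrated_potential = sum(p * w for p, w in zip(input_pulses, connection_weights))
    membrane_potential += integrated_potential       # update the membrane potential
    fires = membrane_potential > threshold           # issuance (firing) condition
    return membrane_potential, fires

potential, fires = integrate_and_check(
    input_pulses=[1, 0, 1],                          # pulses from connected second neurons
    connection_weights=[0.4, 0.3, 0.5],
    membrane_potential=0.2,                          # historical membrane potential
    threshold=1.0,
)
```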
Fig. 9 is a flowchart of another data processing method provided by an embodiment of the present disclosure. Referring to Fig. 9, an embodiment of the present disclosure provides another data processing method applied to a processing core of a many-core system. The method may be performed by a data processing apparatus, which may be implemented in software and/or hardware and may be integrated in the processing core of the many-core system. The data processing method includes:
step S31 is to acquire a connection weight between the first neuron and the second neuron by the weight generation method when the first neuron of the target processing core satisfies the issue condition.
Step S32: sending input information to the second neuron, where the input information includes an input pulse and the connection weight between the first neuron and the second neuron, so that the second neuron performs a membrane potential integration operation according to the input pulse and the connection weight.
The second neuron is a successor neuron of the first neuron, and the input pulse is a firing pulse sent by the first neuron to the second neuron.
In some alternative embodiments, the second neuron is located in a processing core subsequent to the target processing core. The input pulse is a pulse sent by the target processing core to the subsequent processing core. Specifically, the input information sent to the successor processing core may include, but is not limited to: an input pulse, an identification of a first neuron that issued the input pulse, an identification of a destination processing core (i.e., a successor processing core), an identification of a destination neuron (i.e., a second neuron).
In some optional embodiments, the second neuron is located in the target processing core, and the first neuron that fires the pulses to the second neuron and the connection weights between the first neuron and the second neuron may be determined by the target processing core.
For a specific description of the process of performing the membrane potential integration operation on the second neuron, reference may be made to the above description of the step of performing the membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight, which is not described herein again.
Fig. 10 is a block diagram of a weight generation apparatus provided in an embodiment of the present disclosure, and the apparatus is applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores, and the processing core includes a plurality of neurons, as shown in fig. 10, the weight generation apparatus 300 includes: a control module 301 and a weight generator 302.
The control module 301 is configured to control the weight generator to generate at least part of the connection weights corresponding to the first neuron in the target processing core;
wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
In some embodiments, the weight generator 302 includes a first random number generator; the control module 301 is configured to send first control information to the first random number generator, where the first control information contains a connection matrix corresponding to the first neuron in the target processing core, and the connection matrix includes information characterizing whether connection weights need to be generated between the first neuron and each second neuron in the target processing core. The first random number generator is configured to generate, in response to the first control information, at least part of the connection weights between the first neurons and the second neurons that require generation of connection weights, based on the first random seed and the connection matrix.
In some embodiments, as shown in Figs. 5 and 10, the weight generator 302 includes a first random number generator and a second random number generator coupled to the first random number generator. The control module 301 is configured to generate, with the second random number generator, a connection matrix corresponding to the first neuron in the target processing core, the connection matrix including information characterizing whether connection weights need to be generated between the first neuron and the second neurons in the target processing core. The first random number generator is configured to generate at least part of the corresponding connection weights between the first neurons and the second neurons that need connection weights according to the first random seed and the connection matrix.
In some embodiments, as shown in Figs. 5 and 10, the control module 301 is configured to send second control information to the second random number generator, where the second control information includes weight sparsity information corresponding to the first neuron in the target processing core. The second random number generator is configured to: in response to the second control information, generate a connection matrix whose sparsity matches the weight sparsity information according to the second random seed, and send the connection matrix corresponding to the first neuron in the target processing core to the first random number generator.
In some embodiments, as shown in Fig. 6, the first random number generator includes a uniform number generator and a distribution transformer connected to the uniform number generator, the distribution transformer having a preset distribution function and corresponding distribution parameters configured therein. The uniform number generator is configured to generate, according to the first random seed and the connection matrix, uniform random numbers for the first neurons and second neurons that need connection weights; the uniform random numbers follow a uniform distribution. The distribution transformer is configured to transform the uniform random numbers, according to the preset distribution function and the corresponding distribution parameters, into connection weights that satisfy the preset distribution function.
For specific implementation of the weight generating apparatus 300 and the weight generator 302, reference may be made to the description in the above embodiments of the weight generating method, and details are not repeated here.
Another weight generation apparatus provided in an embodiment of the present disclosure is applied to a processing core of a many-core system, where the many-core system includes a plurality of processing cores, and the processing core includes a plurality of neurons, and includes:
a receiving unit configured to receive at least part of the connection weights corresponding to the first neuron transmitted by the at least one generating core.
Wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
The implementation of the weight generation apparatus may refer to the description in the embodiment of the weight generation method, and is not described herein again.
Fig. 11 is a block diagram of a data processing apparatus according to an embodiment of the present disclosure, and as shown in fig. 11, the data processing apparatus 400 includes a weight obtaining unit 401 and an integral issuing unit 402.
The weight acquisition unit 401 is configured to acquire, in response to receiving an input pulse, a connection weight corresponding to a first neuron corresponding to the input pulse in the target processing core by the weight generation device as described above; the connection weight is a connection weight between the first neuron and the second neuron. The integration issuing unit 402 is configured to perform a membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
In some embodiments, the weight obtaining unit 401 may employ the above weight generating device.
For a specific implementation of the data processing apparatus 400, reference may be made to the above weight generation apparatus and the description in the embodiment of the data processing method shown in fig. 8, which is not described herein again.
Fig. 12 is a block diagram of a data processing apparatus according to an embodiment of the present disclosure, and as shown in fig. 12, the data processing apparatus 500 includes a weight obtaining unit 501 and an integral issuing unit 502.
The weight acquisition unit 501 is configured to acquire a connection weight between a first neuron and a second neuron of a target processing core by the above-described weight generation means in a case where the first neuron satisfies a firing condition. The integration issuance unit 502 is configured to send input information to the second neuron, the input information including an input pulse and a connection weight between the first neuron and the second neuron, for the second neuron to perform a membrane potential integration operation according to the input pulse and the connection weight.
In some embodiments, the weight obtaining unit 501 may employ the above weight generating device.
For a specific implementation of the data processing apparatus 500, reference may be made to the above weight generation apparatus and the description in the embodiment of the data processing method shown in fig. 9, which is not described herein again.
The embodiment of the present disclosure further provides a processing core, where the processing core includes the above weight generation apparatus.
The embodiment of the disclosure also provides a processing core, which includes the data processing device.
The embodiment of the present disclosure further provides a processing core, where the processing core includes the above weight generation apparatus and the above data processing apparatus.
The embodiment of the disclosure further provides a many-core system, which includes a plurality of processing cores, and at least one processing core adopts the processing core described in any of the above embodiments.
Fig. 13 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Referring to Fig. 13, an electronic device according to an embodiment of the present disclosure includes a plurality of processing cores 601 and a network on chip 602, where the plurality of processing cores 601 are all connected to the network on chip 602, and the network on chip 602 is configured to exchange data among the plurality of processing cores 601 and with external data.
One or more instructions are stored in the one or more processing cores 601, and the one or more instructions are executed by the one or more processing cores 601, so that the one or more processing cores 601 can execute the weight generation method or the data processing method.
Furthermore, the embodiment of the present disclosure also provides a computer readable medium, on which a computer program is stored, wherein the computer program, when being executed by the processing core, implements the weight generation method or the data processing method described above.
The embodiments of the present disclosure also provide a computer program product, which includes a computer program, and when being executed by a processing core, the computer program implements the weight generation method or the data processing method described above.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.
Example embodiments have been disclosed herein, and although specific terms are employed, they are used and should be interpreted in a generic and descriptive sense only and not for purposes of limitation. In some instances, features, characteristics and/or elements described in connection with a particular embodiment may be used alone or in combination with features, characteristics and/or elements described in connection with other embodiments, unless expressly stated otherwise, as would be apparent to one skilled in the art. Accordingly, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure as set forth in the appended claims.

Claims (20)

1. A weight generation method applied to a processing core of a many-core system, wherein the many-core system comprises a plurality of processing cores, the processing core comprises a plurality of neurons, and the weight generation method comprises the following steps:
generating at least part of connection weights corresponding to a first neuron in the target processing core by using a preset weight generator;
wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
2. The weight generation method according to claim 1, wherein before the generating at least part of the connection weight corresponding to the first neuron in the target processing core by using the preset weight generator, the method further comprises: receiving a weight generation instruction of the target processing core;
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
and generating at least part of connection weight corresponding to the first neuron by using a preset weight generator according to the weight generation instruction.
3. The weight generation method of claim 1, wherein the processing core comprises an instruction storage space comprising weight generation instructions, the weight generation instructions comprising instruction execution time beats,
wherein the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
when the instruction execution time beat starts, the weight generation instruction is obtained and executed from the instruction storage space, and at least part of connection weights corresponding to the first neuron are generated by using a preset weight generator.
4. The weight generation method according to claim 1, wherein the weight generator includes a first random number generator; the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
sending first control information to the first random number generator, wherein the first control information contains a connection matrix corresponding to a first neuron in a target processing core, and the connection matrix comprises information for representing whether connection weights need to be generated between the first neuron and each second neuron;
the first random number generator is configured to generate, in response to the first control information, at least part of connection weights corresponding to the first neuron and the second neuron, which need to generate connection weights, according to a first random seed and the connection matrix.
5. The weight generation method according to claim 1, wherein the weight generator includes a first random number generator;
the generating, by using a preset weight generator, at least part of the connection weights corresponding to the first neuron in the target processing core includes:
generating a connection matrix corresponding to a first neuron in a target processing core, wherein the connection matrix comprises information for representing whether connection weight generation is needed between the first neuron and each second neuron;
and generating at least part of corresponding connection weights between the first neuron and the second neuron needing to generate the connection weights according to the first random seed and the connection matrix by utilizing the first random number generator.
6. The weight generation method of claim 5, wherein the weight generator further comprises a second random number generator connected to the first random number generator;
the generating a connection matrix corresponding to a first neuron in a target processing core comprises:
generating, with the second random number generator, the connection matrix corresponding to a first neuron in a target processing core.
7. The weight generation method of claim 6, wherein said generating, with the second random number generator, a connection matrix corresponding to a first neuron in a target processing core comprises:
sending second control information to the second random number generator, the second control information including weight sparsity information corresponding to a first neuron in a target processing core, to cause the second random number generator to perform:
responding to the second control information, and generating the connection matrix with sparsity matched with the weight sparsity information according to a second random seed;
sending the connection matrix corresponding to a first neuron in a target processing core to the first random number generator.
8. The weight generation method of claim 5, wherein the generating a connection matrix corresponding to a first neuron in a target processing core comprises:
randomly selecting one second neuron from a plurality of second neurons connected with the first neuron, and determining the second neuron needing to generate connection weight with the first neuron;
under the condition that the number of the selected second neurons is smaller than the target number, continuing to randomly select the next second neuron and determining the second neurons needing to generate connection weights with the first neurons;
and under the condition that the number of the selected second neurons is equal to the target number, generating the connection matrix corresponding to the first neuron in the target processing core according to all the selected second neurons needing to generate connection weights with the first neuron.
9. The weight generation method according to any one of claims 4 to 8, wherein the first random number generator includes a uniform number generator and a distribution transformer connected to the uniform number generator, the distribution transformer having a preset distribution function and corresponding distribution parameters configured therein;
the uniform number generator is used for generating uniform random numbers corresponding to the first neuron and the second neuron which need to generate the connection weight according to the first random seed and the connection matrix, and the uniform random numbers meet a uniform distribution rule;
the distribution transformer is used for transforming the uniform random number into the connection weight meeting a preset distribution function according to the distribution parameters and the preset distribution function.
10. A weight generation method is applied to a processing core of a many-core system, wherein the many-core system comprises a plurality of processing cores, the processing core comprises a plurality of neurons, and the method comprises the following steps:
receiving at least a portion of the connection weights corresponding to the first neuron sent by the at least one generating core,
wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
11. A method of data processing, comprising:
in response to receiving an input pulse, acquiring a connection weight corresponding to a first neuron in a target processing core corresponding to the input pulse by using the weight generation method of any one of claims 1 to 10; the connection weight is a connection weight between the first neuron and the second neuron;
and performing membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
12. A method of data processing, comprising:
in the case that a first neuron of a target processing core satisfies an issuance condition, acquiring a connection weight between the first neuron and a second neuron by using the weight generation method of any one of claims 1 to 10;
and sending input information to the second neuron, wherein the input information comprises an input pulse and a connection weight between the first neuron and the second neuron, so that the second neuron can perform membrane potential integration operation according to the input pulse and the connection weight.
13. A weight generation apparatus applied to a processing core of a many-core system, the many-core system including a plurality of processing cores, the processing core including a plurality of neurons, the weight generation apparatus comprising: a control module and a weight generator;
the control module configured to control the weight generator to generate at least part of the connection weights corresponding to the first neuron in the target processing core;
wherein the connection weight is a connection weight between the first neuron and the second neuron, and the connection weight is used for performing a membrane potential integration operation of the corresponding first neuron or second neuron.
14. A weight generation apparatus applied to a processing core of a many-core system, the many-core system including a plurality of processing cores, the processing core including a plurality of neurons, comprising:
a receiving unit configured to receive at least part of the connection weights corresponding to the first neuron transmitted by the at least one generating core,
wherein at least part of the connection weights corresponding to the first neuron are generated by the generating core by using a preset weight generator, the connection weights are connection weights between the first neuron and the second neuron, and the connection weights are used for performing membrane potential integration operation on the corresponding first neuron or the corresponding second neuron.
15. A data processing apparatus comprising:
a weight obtaining unit configured to, in response to receiving an input pulse, obtain a connection weight corresponding to a first neuron corresponding to the input pulse in a target processing core using the weight generating apparatus according to claim 13 or the weight generating apparatus according to claim 14; the connection weight is a connection weight between the first neuron and the second neuron;
an integration issuing unit configured to perform a membrane potential integration operation on the first neuron according to the input pulse corresponding to the first neuron and the corresponding connection weight.
16. A data processing apparatus comprising:
a weight obtaining unit configured to obtain, in a case where a first neuron of a target processing core satisfies a firing condition, a connection weight between the first neuron and a second neuron by using the weight generating apparatus of claim 13 or the weight generating apparatus of claim 14;
an integration issuance unit configured to send input information to the second neuron, the input information including an input pulse and a connection weight between the first neuron and the second neuron, for the second neuron to perform a membrane potential integration operation according to the input pulse and the connection weight.
17. A processing core comprising weight generation means and/or data processing means;
the weight generation means comprises the weight generation means of claim 13 or 14;
the data processing apparatus comprising a data processing apparatus as claimed in claim 15 or a data processing apparatus as claimed in claim 16.
18. A many-core system comprising a plurality of processing cores, at least one of said processing cores employing the processing core of claim 17.
19. An electronic device, comprising:
a plurality of processing cores; and
a network on chip configured to interact data among the plurality of processing cores and external data;
one or more of the processing cores have stored therein one or more instructions that are executed by the one or more processing cores to enable the one or more processing cores to perform the weight generation method of any of claims 1-10, or the data processing method of claim 11, or the data processing method of claim 12.
20. A computer readable medium having stored thereon a computer program, wherein the computer program, when executed by a processing core, implements the weight generation method of any of claims 1-10, or the data processing method of claim 11, or the data processing method of claim 12.
CN202110976731.XA 2021-08-24 2021-08-24 Weight generation method, data processing method and device, electronic device and medium Pending CN113673688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110976731.XA CN113673688A (en) 2021-08-24 2021-08-24 Weight generation method, data processing method and device, electronic device and medium

Publications (1)

Publication Number Publication Date
CN113673688A true CN113673688A (en) 2021-11-19

Family

ID=78545735

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110976731.XA Pending CN113673688A (en) 2021-08-24 2021-08-24 Weight generation method, data processing method and device, electronic device and medium

Country Status (1)

Country Link
CN (1) CN113673688A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023134561A1 (en) * 2022-01-11 2023-07-20 北京灵汐科技有限公司 Data processing method and apparatus, and electronic device and computer-readable medium
WO2023208243A1 (en) * 2022-04-29 2023-11-02 北京灵汐科技有限公司 Weight storage method, apparatus and system, weight transmission method, apparatus and system, weight calculation method, apparatus and system, and device
WO2023208027A1 (en) * 2022-04-29 2023-11-02 北京灵汐科技有限公司 Information processing method and information processing unit, and device, medium and product
WO2023208178A1 (en) * 2022-04-29 2023-11-02 北京灵汐科技有限公司 Information processing method and unit, chip, device, medium, and product

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150039546A1 (en) * 2013-08-02 2015-02-05 International Business Machines Corporation Dual deterministic and stochastic neurosynaptic core circuit
CN106127301A (en) * 2016-01-16 2016-11-16 上海大学 A kind of stochastic neural net hardware realization apparatus
US20190042913A1 (en) * 2018-01-30 2019-02-07 Intel Corporation Memoryless weight storage hardware for neural networks
CN112352248A (en) * 2018-06-19 2021-02-09 奥利拜技术股份有限公司 Apparatus and method for constructing a neural network with feedforward and feedback paths using a parametric genome characterizing neural network connections as building blocks
CN112348177A (en) * 2019-07-05 2021-02-09 安徽寒武纪信息科技有限公司 Neural network model verification method and device, computer equipment and storage medium
CN112698811A (en) * 2021-01-11 2021-04-23 湖北大学 Neural network random number generator sharing circuit, sharing method and processor chip

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SEONGCHUL PARK ET AL.: "The weights initialization methodology of unsupervised neural networks to improve clustering stability", THE JOURNAL OF SUPERCOMPUTING, vol. 76, 12 July 2019 (2019-07-12), XP037195473, DOI: 10.1007/s11227-019-02940-4 *
LI, Li et al.: "Application of echo state neural networks based on compressed sensing in time series prediction", Software Guide (软件导刊), no. 04, 15 April 2020 (2020-04-15)
LUO, Xiong et al.: "Research progress of echo state networks", Journal of University of Science and Technology Beijing (北京科技大学学报), vol. 34, no. 2, 24 February 2012 (2012-02-24)

Similar Documents

Publication Publication Date Title
CN113673688A (en) Weight generation method, data processing method and device, electronic device and medium
US10713561B2 (en) Multiplexing physical neurons to optimize power and area
US8655813B2 (en) Synaptic weight normalized spiking neuronal networks
CN108460460A (en) Technology for optimized machine learning training
CN106845633B (en) Neural network information conversion method and system
CN103699440A (en) Method and device for cloud computing platform system to distribute resources to task
US20140214739A1 (en) Cortical simulator
CN107783840A (en) A kind of Distributed-tier deep learning resource allocation methods and device
CN108154232A (en) Pruning method, device, equipment and the readable storage medium storing program for executing of artificial neural network
KR20220009682A (en) Method and system for distributed machine learning
CN111353591A (en) Computing device and related product
CN117634564B (en) Pulse delay measurement method and system based on programmable nerve mimicry core
US11709910B1 (en) Systems and methods for imputing missing values in data sets
CN112949853B (en) Training method, system, device and equipment for deep learning model
CN115794570A (en) Pressure testing method, device, equipment and computer readable storage medium
CN114781630A (en) Weight data storage method and device, chip, electronic equipment and readable medium
US20220036190A1 (en) Neural network compression device
CN114925817A (en) Data processing method, delay chain unit, delay device and many-core system
CN114595815A (en) Transmission-friendly cloud-end cooperation training neural network model method
CN108764464B (en) Neuron information sending method, device and storage medium
CN114721599A (en) Weight data storage method and device, chip, electronic equipment and readable medium
WO2023208243A1 (en) Weight storage method, apparatus and system, weight transmission method, apparatus and system, weight calculation method, apparatus and system, and device
EP3830763A1 (en) Data processing module, data processing system and data processing method
CN114792128A (en) Method for transmitting weight data, many-core system, electronic device and medium
CN116663622A (en) Biological trusted neuron calculation circuit and calculation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination