TWI730452B - Stereo artificial neural network system - Google Patents

Stereo artificial neural network system

Info

Publication number
TWI730452B
TWI730452B
Authority
TW
Taiwan
Prior art keywords
hidden layer
layer
node
hidden
neural network
Prior art date
Application number
TW108137183A
Other languages
Chinese (zh)
Other versions
TW202117703A (en)
Inventor
林維崙
Original Assignee
逢甲大學 (Feng Chia University)
Priority date
Filing date
Publication date
Application filed by 逢甲大學 (Feng Chia University)
Priority to TW108137183A
Publication of TW202117703A
Application granted
Publication of TWI730452B

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention discloses a stereo artificial neural network system comprising an input layer, at least one hidden layer and an output layer. The input layer includes at least one input layer node. Each hidden layer includes at least one hidden layer node, and each input layer node is connected to one of the hidden layer nodes. The output layer includes at least one output layer node, and each output layer node is connected to one of the hidden layer nodes. Furthermore, each hidden layer node is connected to every hidden layer node included in any of the other hidden layers. The system can thereby form a richer connection pattern with a small number of nodes and provide improved processing performance and speed.

Description

Stereo artificial neural network system (立體類神經網路系統)

The present invention provides a stereo artificial neural network system, in particular one in which every hidden layer node of each hidden layer can be connected to every hidden layer node of any other hidden layer.

In machine learning and cognitive science, an artificial neural network (ANN) is a mathematical or computational model that imitates the structure and function of a biological neural network (such as the central nervous system of an animal) and is used to estimate or approximate functions. An ANN performs its computation through the connection of a large number of artificial neurons; in most cases it can change its internal structure on the basis of external information, making it an adaptive system.

A modern neural network is a non-linear statistical data-modeling tool. It is usually optimized through a learning method based on mathematical statistics, and standard statistical methods yield a large family of functions with which the local structure of a space can be expressed. In the artificial-perception field of artificial intelligence, moreover, problems of artificial perception can also be addressed through the application of mathematical statistics, an approach that has advantages over traditional logical reasoning and calculation.

Like other machine-learning methods, neural networks have been applied to a wide variety of problems, such as machine vision and speech recognition, that are difficult to solve with traditional rule-based programming. In current neural network architectures, however, the nodes of a given hidden layer have no possibility of being connected to the nodes of the other hidden layers, which partially restricts the overall connection pattern of the network.

Accordingly, to improve on the prior art described above, in which the conventional artificial neural network (ANN) architecture gives a hidden layer no possibility of connecting to the nodes of the other hidden layers, the present application proposes a stereo artificial neural network system comprising: an input layer including at least one input layer node; at least one hidden layer, each hidden layer including at least one hidden layer node, with each input layer node connected to one of the hidden layer nodes; and an output layer including at least one output layer node, with each output layer node connected to one of the hidden layer nodes.

Furthermore, each hidden layer node is connected to every hidden layer node included in any other hidden layer.

Furthermore, a signal transfer vector runs from the input layer to the output layer, and the signal transfer vector takes the ordering of the at least one hidden layer as the direction of signal transfer.

The above brief description is intended to give a basic account of several aspects and technical features of the present invention. It is not a detailed statement of the invention; its purpose is neither to single out the key or essential elements of the invention nor to delimit the scope of the invention, but merely to present several of its concepts in a concise manner.

1: stereo artificial neural network system
10: input layer
10a: input layer node
20: hidden layer
21: first hidden layer
21a: first hidden layer node A
21b: first hidden layer node B
22: second hidden layer
22a: second hidden layer node A
22b: second hidden layer node B
23: third hidden layer
23a: third hidden layer node A
23b: third hidden layer node B
24: fourth hidden layer
24a: fourth hidden layer node A
24b: fourth hidden layer node B
30: output layer
30a: output layer node

The first figure is a schematic diagram of the stereo artificial neural network architecture of a preferred embodiment of the present invention.

The second figure is an expanded view of the stereo artificial neural network architecture of the preferred embodiment of the present invention.

So that the technical features and practical effects of the present invention can be understood and implemented in accordance with the contents of the specification, the preferred embodiments shown in the figures are further described in detail below:

To improve on the prior art described above, in which each hidden layer of a conventional artificial neural network (ANN) architecture has no possibility of connecting to the nodes of the other hidden layers, the present invention proposes a stereo artificial neural network system in which every hidden layer node of each hidden layer can be connected to every hidden layer node of any other hidden layer. The system can thereby form a more powerful connection pattern with comparatively few nodes and so provide improved processing performance and speed.
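For a rough sense of what this richer connectivity means in numbers, the short sketch below counts the hidden-to-hidden connections available in the four-hidden-layer, two-node embodiment described below, comparing the cross-layer pattern proposed here with a conventional chain in which only adjacent hidden layers are connected. The script is an illustration only and is not part of the original disclosure.

```python
from itertools import combinations

# Hidden-layer layout of the embodiment in the figures:
# four hidden layers (21-24), each with two nodes (A, B).
layers = {21: ["21a", "21b"], 22: ["22a", "22b"],
          23: ["23a", "23b"], 24: ["24a", "24b"]}
nodes = [(layer, node) for layer, members in layers.items() for node in members]

# Stereo / cross-layer pattern: every pair of hidden nodes that sit in
# different hidden layers is connected.
stereo = sum(1 for (la, _), (lb, _) in combinations(nodes, 2) if la != lb)

# Conventional chain: only nodes in adjacent hidden layers are connected.
chain = sum(len(layers[l]) * len(layers[l + 1]) for l in [21, 22, 23])

print(stereo, chain)  # 24 vs. 12 hidden-to-hidden connections for the same 8 nodes
```

With the same eight hidden nodes, the cross-layer pattern therefore offers twice as many trainable hidden-to-hidden connections (24 versus 12), before even counting the within-layer connections that the embodiment also allows.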

First, please refer to the first and second figures together; the first figure is a schematic diagram of the stereo artificial neural network architecture of a preferred embodiment of the present invention, and the second figure is an expanded schematic diagram of the same architecture. As shown in the first and second figures, the stereo artificial neural network system 1 of this embodiment comprises an input layer 10 that includes at least one input layer node 10a; at least one hidden layer 20, each hidden layer (21, 22, 23, 24) including at least one hidden layer node (21a, 21b, 22a, 22b, 23a, 23b, 24a, 24b), with each input layer node 10a connected to one of the hidden layer nodes (21a, 21b, 22a, 22b, 23a, 23b, 24a, 24b); and an output layer 30 that includes at least one output layer node 30a, with each output layer node 30a connected to one of the hidden layer nodes (21a, 21b, 22a, 22b, 23a, 23b, 24a, 24b).

Furthermore, each hidden layer node (21a, 21b, 22a, 22b, 23a, 23b, 24a, 24b) is connected to every hidden layer node included in any other hidden layer. Taking the first figure (and the second figure) as an example, the at least one hidden layer 20 has four layers, which are, in order, the first hidden layer 21, the second hidden layer 22, the third hidden layer 23 and the fourth hidden layer 24, and each hidden layer (21, 22, 23, 24) contains two hidden layer nodes. In other words, the first hidden layer 21 includes a first hidden layer node A (21a) and a first hidden layer node B (21b); the second hidden layer 22 includes a second hidden layer node A (22a) and a second hidden layer node B (22b); the third hidden layer 23 includes a third hidden layer node A (23a) and a third hidden layer node B (23b); and the fourth hidden layer 24 includes a fourth hidden layer node A (24a) and a fourth hidden layer node B (24b).

As is clear from the first figure (and the second figure), the first hidden layer node A (21a) can be connected not only to the first hidden layer node B (21b) contained in the first hidden layer 21 itself, but also to all nodes of the other hidden layers, namely the second hidden layer nodes A and B (22a, 22b), the third hidden layer nodes A and B (23a, 23b), and the fourth hidden layer nodes A and B (24a, 24b).

Likewise, the second hidden layer node A (22a) can be connected not only to the second hidden layer node B (22b) contained in the second hidden layer itself, but also to all nodes of the other hidden layers, namely the first hidden layer nodes A and B (21a, 21b), the third hidden layer nodes A and B (23a, 23b), and the fourth hidden layer nodes A and B (24a, 24b). The third hidden layer node A (23a) can be connected not only to the third hidden layer node B (23b) contained in the third hidden layer itself, but also to all nodes of the other hidden layers, namely the first hidden layer nodes A and B (21a, 21b), the second hidden layer nodes A and B (22a, 22b), and the fourth hidden layer nodes A and B (24a, 24b). The fourth hidden layer node A (24a) can be connected not only to the fourth hidden layer node B (24b) contained in the fourth hidden layer itself, but also to all nodes of the other hidden layers, namely the first hidden layer nodes A and B (21a, 21b), the second hidden layer nodes A and B (22a, 22b), and the third hidden layer nodes A and B (23a, 23b). It is worth noting that, in other possible embodiments, the number of hidden layers in the stereo artificial neural network system and the number of hidden layer nodes contained in each hidden layer can be adjusted as required; the present invention is not limited in this respect.
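To make the enumeration above easier to follow, the short sketch below builds the neighbor list for each hidden layer node of this embodiment, using the reference numerals as node labels; it is an illustration of the described connectivity, not code from the patent.

```python
# Reference numerals from the figures: hidden layers 21-24, two nodes each.
hidden_layers = {
    21: ["21a", "21b"],
    22: ["22a", "22b"],
    23: ["23a", "23b"],
    24: ["24a", "24b"],
}

# In this embodiment a hidden node connects to the other node of its own
# layer and to every node of every other hidden layer.
neighbors = {}
for layer, members in hidden_layers.items():
    for node in members:
        same_layer = [m for m in members if m != node]
        other_layers = [m for l, ms in hidden_layers.items() if l != layer for m in ms]
        neighbors[node] = same_layer + other_layers

print(neighbors["21a"])
# ['21b', '22a', '22b', '23a', '23b', '24a', '24b']
```

The printed list for node 21a reproduces exactly the connections enumerated above for the first hidden layer node A.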

In addition, each node from the input layer 10 to the output layer 30 has a signal transfer vector, and the signal transfer vector takes the ordering of the at least one hidden layer 20 as the direction of signal transfer.

The stereo artificial neural network system of the present invention is explained further below.

First, suppose that a stereo artificial neural network system has L layers of hidden layers; its function can then be expressed as

f : ℝ^D → ℝ^(M^(L)),

where D is the dimension of the input vector x = [x_0, x_1, ..., x_(D-1)] and M^(L) is the dimension of the output vector f(x). In matrix form the output is

f_n(x) = G( b_n^(L) + Σ_(i=0)^(L-1) Σ_(m=0)^(M^(i)-1) w_(m,n)^(i,L) a_m^(i) ), n = 0, 1, ..., M^(L)-1,

where the bias vector b^(L) is written [b_0^(L), b_1^(L), ..., b_(M^(L)-1)^(L)] and w_(m,n)^(i,l) denotes the entry of the weight matrix from the m-th node in the i-th layer to the n-th node in the l-th layer. The hidden layer nodes are given by

a_n^(l) = h( b_n^(l) + Σ_(i=0)^(l-1) Σ_(m=0)^(M^(i)-1) w_(m,n)^(i,l) a_m^(i) ), l = 1, 2, ..., L-1, n = 0, 1, ..., M^(l)-1,

with a^(0) = x and M^(0) = D; here G and h are both activation functions.

Following from the above, the activation function h(x) can form the at least one hidden layer and is defined as sigmoid(x) = 1/(1 + e^(-x)).
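A minimal sketch of this formulation is given below in Python, assuming NumPy, treating the input as layer 0 and the largest layer index as the output layer, and keying each cross-layer weight matrix by the pair (source layer, target layer); the equations in the original are supplied as images, so the sketch follows the reconstruction given above rather than the patent's exact notation.

```python
import numpy as np

def sigmoid(x):
    # h(x) = 1 / (1 + e^(-x)), the hidden-layer activation defined in the text
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases, G=sigmoid, h=sigmoid):
    """Forward pass of a feed-forward network whose layers are densely
    cross-connected: layer l receives input from every earlier layer i < l.

    weights[(i, l)] is the M(i) x M(l) weight matrix from layer i to layer l,
    biases[l] is the bias vector of layer l; layer 0 is the input layer and
    the largest layer index is the output layer (activation G), all other
    layers are hidden layers (activation h).
    """
    L = max(l for (_, l) in weights)          # index of the output layer
    a = {0: np.asarray(x, dtype=float)}       # a^(0) = x
    for l in range(1, L + 1):
        z = biases[l] + sum(a[i] @ weights[(i, l)]
                            for i in range(l) if (i, l) in weights)
        a[l] = G(z) if l == L else h(z)
    return a[L]
```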

Here the direction of signal transfer from the input layer to the output layer follows the ordering of the at least one hidden layer. Taking the first figure as an example, the stereo artificial neural network system has four hidden layers and each hidden layer has two hidden layer nodes, so each node can be expressed in the form given above,

a_n^(l) = h( b_n^(l) + Σ_(i=0)^(l-1) Σ_(m=0)^(M-1) w_(m,n)^(i,l) a_m^(i) ),

with l = 1, 2, ..., L-1 and n = 0, 1, ..., M-1, where in this example every hidden layer has M = 2 nodes and there are four layers of hidden layer nodes.
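Continuing the sketch above (and reusing its `forward` and `sigmoid` definitions), the four-hidden-layer, two-node embodiment could be instantiated roughly as follows; the input size of 3, the single output node, the layer indexing and the random weights are illustrative assumptions, not values taken from the patent.

```python
rng = np.random.default_rng(0)

D, M = 3, 2
sizes = {0: D, 1: M, 2: M, 3: M, 4: M, 5: 1}   # input, four hidden layers, output

# Every earlier layer feeds every later layer, which in particular connects
# each hidden layer to every other hidden layer, as in the embodiment.
weights = {(i, l): rng.normal(size=(sizes[i], sizes[l]))
           for l in range(1, 6) for i in range(l)}
biases = {l: rng.normal(size=sizes[l]) for l in range(1, 6)}

print(forward([0.2, -1.0, 0.5], weights, biases))   # a single sigmoid output in (0, 1)
```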

In addition, the stereo artificial neural network system of the first figure can be further expanded into the neural network system of the second figure. As shown in the second figure, the hidden layer nodes of the same hidden layer pass their signals downward at the same time; in other possible embodiments, however, the direction of signal transfer may be modified or adjusted according to the actual situation, and the present invention is not limited in this respect.

The stereo artificial neural network system described above can be applied to fields related to artificial intelligence.

However, the above are only preferred embodiments of the present invention and do not limit the scope of its implementation; simple changes and modifications made in accordance with the scope of the patent claims and the contents of the specification still fall within the scope covered by the present invention.

1 … stereo artificial neural network system; 10 … input layer; 10a … input layer node; 21 … first hidden layer; 22 … second hidden layer; 23 … third hidden layer; 24 … fourth hidden layer; 30 … output layer; 30a … output layer node

Claims (5)

1. A stereo artificial neural network system, comprising: an input layer including at least one input layer node; at least one hidden layer, each hidden layer including at least one hidden layer node, each input layer node being connected to one of the hidden layer nodes; and an output layer including at least one output layer node, each output layer node being connected to one of the hidden layer nodes; wherein each hidden layer node is connected to every hidden layer node included in any other hidden layer, and the hidden layer nodes are connected to one another across layers; and wherein the stereo artificial neural network system, having L layers of hidden layers, satisfies the following conditions: its function is f : ℝ^D → ℝ^(M^(L)), where D is the dimension of an input vector x = [x_0, x_1, ..., x_(D-1)] and M^(L) is the dimension of an output vector f(x) = G( b^(L) + Σ_(i=0)^(L-1) Σ_(m=0)^(M^(i)-1) w_(m,n)^(i,L) a_m^(i) ), n = 0, 1, ..., M^(L)-1; b^(L) is a bias vector written [b_0^(L), b_1^(L), ..., b_(M^(L)-1)^(L)]; w_(m,n)^(i,l) is a weight vector expressed as the weight matrix from the m-th node in the i-th layer to the n-th node in the l-th layer; the hidden layer nodes satisfy a_n^(l) = h( b_n^(l) + Σ_(i=0)^(l-1) Σ_(m=0)^(M^(i)-1) w_(m,n)^(i,l) a_m^(i) ), with l = 1, 2, ..., L-1 and n = 0, 1, ..., M^(l)-1; and G and h are both activation functions.

2. The stereo artificial neural network system according to claim 1, wherein a signal transfer vector runs from the input layer to the output layer.

3. The stereo artificial neural network system according to claim 2, wherein the signal transfer vector takes the ordering of the at least one hidden layer as the direction of signal transfer.

4. The stereo artificial neural network system according to claim 1, wherein the at least one hidden layer has four layers, which are, in order, a first hidden layer, a second hidden layer, a third hidden layer and a fourth hidden layer.

5. The stereo artificial neural network system according to claim 4, wherein each hidden layer includes two hidden layer nodes.
TW108137183A 2019-10-16 2019-10-16 Stereo artificial neural network system TWI730452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW108137183A TWI730452B (en) 2019-10-16 2019-10-16 Stereo artificial neural network system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW108137183A TWI730452B (en) 2019-10-16 2019-10-16 Stereo artificial neural network system

Publications (2)

Publication Number Publication Date
TW202117703A TW202117703A (en) 2021-05-01
TWI730452B true TWI730452B (en) 2021-06-11

Family

ID=77020603

Family Applications (1)

Application Number Title Priority Date Filing Date
TW108137183A TWI730452B (en) 2019-10-16 2019-10-16 Stereo artificial neural network system

Country Status (1)

Country Link
TW (1) TWI730452B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9460048B2 (en) * 2005-03-28 2016-10-04 Gerald George Pechanek Methods and apparatus for creating and executing a packet of instructions organized according to data dependencies between adjacent instructions and utilizing networks based on adjacencies to transport data in response to execution of the instructions
CN104966104A (en) * 2015-06-30 2015-10-07 孙建德 Three-dimensional convolutional neural network based video classifying method
KR20180108501A (en) * 2017-03-24 2018-10-04 (주)제이엘케이인스펙션 Apparatus and method for analyzing images using semi 3d deep neural network
CN107766816A (en) * 2017-10-18 2018-03-06 河海大学 A kind of Mechanical Failure of HV Circuit Breaker recognition methods based on LVQ neutral nets
CN110263735A (en) * 2019-06-25 2019-09-20 北京林业大学 A method of tree species classification being carried out to artificial forest high-spectral data using Three dimensional convolution neural network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
georgesale, 立体神经网络 (Three-dimensional neural network), 2017/11/27, CSDN (China Software Developer Network), https://blog.csdn.net/georgesale/article/details/78647614 *

Also Published As

Publication number Publication date
TW202117703A (en) 2021-05-01

Similar Documents

Publication Publication Date Title
CN109685819B (en) Three-dimensional medical image segmentation method based on feature enhancement
Lin et al. Attribute-Aware Convolutional Neural Networks for Facial Beauty Prediction.
CN108665070A (en) Limit TS fuzzy reasoning methods based on extreme learning machine and system
CN110263236A (en) Social network user multi-tag classification method based on dynamic multi-view learning model
CN113792874A (en) Continuous learning method and device based on innate knowledge
US20190251420A1 (en) Transform for a neurosynaptic core circuit
CN108009635A (en) A kind of depth convolutional calculation model for supporting incremental update
Chen et al. A self-generating modular neural network architecture for supervised learning
TWI730452B (en) Stereo artificial neural network system
JP2020123337A (en) On-device continuous learning method and device of neural network for analyzing input data by optimal sampling of training image for smart phone, drone, ship, or military purpose, and test method and device using it
CN116339942A (en) Self-adaptive scheduling method of distributed training task based on reinforcement learning
CN110277093A (en) The detection method and device of audio signal
WO2019080844A1 (en) Data reasoning method and apparatus, and computer device
KR102535635B1 (en) Neuromorphic computing device
CN109615069B (en) Circuit structure of neural network with asynchronous transmission characteristic
JPH06203005A (en) High speed partitioned neural network and building-up method thereof
Pask et al. The conception of a shape and the evolution of a design
WO2021134519A1 (en) Device and method for realizing data synchronization in neural network inference
KR101880547B1 (en) Method for extracting a feature vector of video using similarity measure
Tahaabdulsadda et al. Acoustics recognition with expert intelligent system
Chen et al. Static correlative filter based convolutional neural network for visual question answering
TW202030647A (en) System and method for reducing computational complexity of artificial neural network
Chen et al. Hierarchical covering algorithm
Liu et al. Learning model-based F0 production through goal-directed babbling
Ling et al. [Retracted] Research on Network Layer Recursive Reduction Model Compression for Image Recognition