
TWI884704B - Three-dimensional memory device - Google Patents


Info

Publication number: TWI884704B
Authority: TW (Taiwan)
Application number: TW113105154A
Prior art keywords: neural network, network data, dimensional memory, arrays, sub
Other languages: Chinese (zh)
Other versions: TW202533232A
Inventors: 林榆瑄, 林昱佑
Original Assignee: 旺宏電子股份有限公司
Priority: TW113105154A
Legal status: Application granted

Landscapes

  • Semiconductor Memories (AREA)

Abstract

A three-dimensional memory device is provided in the present disclosure. The three-dimensional memory device comprises a plurality of word lines, a plurality of bit lines, a three-dimensional memory array, a plurality of encoding circuits, and a plurality of sensing circuits. The three-dimensional memory array comprises a plurality of two-dimensional memory arrays and is configured to store first, second, third, and fourth neural network data related to at least one neural network model. Each of the two-dimensional memory arrays is coupled to the word lines and the bit lines, and is configured to receive a first input voltage and output a first output current, and to receive a second input voltage and output a second output current. The encoding circuits are respectively coupled to the two-dimensional memory arrays and are configured to generate the first and second input voltages based on the first and second neural network data, respectively. The sensing circuits are respectively coupled to the two-dimensional memory arrays and are configured to generate the third and fourth neural network data based on the first and second output currents, respectively.

Description

Three-dimensional memory device

This disclosure relates to data-storage techniques in three-dimensional memory devices, and more particularly to a three-dimensional memory device that stores twice as much data by transmitting signals over two different signal paths.

As memory technology has evolved, three-dimensional memory devices, with their lower cost per bit, have gradually replaced traditional planar memory in many applications. In addition, because a processor spends considerable time and energy reading data from memory, in-memory computing (IMC) has attracted growing attention. With IMC, operations are performed directly inside the memory, improving the speed and efficiency of data access.

A three-dimensional memory device contains a memory array with a large number of memory cells, each of which has a corresponding impedance. By adjusting the impedance of each memory cell, the device can store the data (i.e., the neurons) of a neural network model for use in artificial-intelligence applications.
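The weight-storage idea above can be sketched numerically: treating each cell's programmable conductance (the reciprocal of its impedance) as a stored weight, Ohm's law and Kirchhoff's current law turn the cells sharing one output wire into a multiply-accumulate unit. This is an idealized linear model for illustration, not circuitry from this disclosure; all function names and values are hypothetical.

```python
# Idealized sketch: a cell's conductance acts as a stored weight, and
# currents of cells sharing one wire add up to a dot product.

def cell_current(voltage: float, conductance: float) -> float:
    """Ohm's law: the current contributed by one memory cell."""
    return voltage * conductance

def bitline_current(voltages, conductances) -> float:
    """Currents of all cells on one shared line sum on the wire,
    which is exactly a dot product of inputs and stored weights."""
    return sum(cell_current(v, g) for v, g in zip(voltages, conductances))

# Two inputs (encoded as voltages) against two stored weights:
print(bitline_current([1.0, 0.5], [0.2, 0.4]))  # 1.0*0.2 + 0.5*0.4 = 0.4
```

In an array, many such shared lines operate in parallel, so one read produces a full vector of dot products, which is the core of in-memory computing.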

However, as artificial-intelligence technology develops, the amount of data that must be stored during computation keeps growing. How to increase the amount of neural-network data a three-dimensional memory device can store is therefore one of the open problems in this field.

One aspect of the present disclosure provides a three-dimensional memory device comprising a plurality of word lines, a plurality of bit lines, a three-dimensional memory array, a plurality of encoding circuits, and a plurality of sensing circuits. The three-dimensional memory array comprises a plurality of two-dimensional memory arrays and stores first, second, third, and fourth neural network data related to at least one neural network model. Each two-dimensional memory array is coupled to the word lines and the bit lines, receives a first input voltage and outputs a first output current, and receives a second input voltage and outputs a second output current. The encoding circuits are respectively coupled to the two-dimensional memory arrays and generate the first and second input voltages from the first and second neural network data, respectively. The sensing circuits are respectively coupled to the two-dimensional memory arrays and generate the third and fourth neural network data from the first and second output currents, respectively.

In some embodiments of this aspect, the first neural network data is related to the K-th layer of a first neural network model among the at least one neural network model, and the third neural network data is related to the (K+1)-th layer of the first neural network model; the second neural network data is related to the M-th layer of a second neural network model among the at least one neural network model, and the fourth neural network data is related to the (M+1)-th layer of the second neural network model. The first neural network model differs from the second neural network model, and M and K are positive integers.

In some embodiments of this aspect that store data related to different neural network models, a first two-dimensional memory array among the two-dimensional memory arrays is coupled to two of the sensing circuits, a second two-dimensional memory array is coupled to two of the encoding circuits, and those two sensing circuits are respectively coupled to those two encoding circuits, so as to: input the third neural network data of the first two-dimensional memory array into the second two-dimensional memory array as the first neural network data of the second two-dimensional memory array; and input the fourth neural network data of the first two-dimensional memory array into the second two-dimensional memory array as the second neural network data of the second two-dimensional memory array.

In some embodiments of this aspect that store data related to different neural network models, every two-dimensional memory array receives the first neural network data via the word lines and the second neural network data via the bit lines, and every two-dimensional memory array transmits the third neural network data via the bit lines and the fourth neural network data via the word lines.

In some embodiments of this aspect that store data related to different neural network models, one portion of the two-dimensional memory arrays receives the first neural network data via the word lines, receives the second neural network data via the bit lines, transmits the third neural network data via the bit lines, and transmits the fourth neural network data via the word lines; another portion receives the first neural network data via the bit lines, receives the second neural network data via the word lines, transmits the third neural network data via the word lines, and transmits the fourth neural network data via the bit lines.
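The two opposite signal directions used in these embodiments amount to a transpose pair of matrix-vector products over a single stored array: driving the word lines and sensing the bit lines computes one product, while driving the bit lines and sensing the word lines computes the product with the transposed weight matrix. A minimal sketch under the idealized assumption that each cell is a linear conductance; the matrix and vectors are illustrative, not from the disclosure.

```python
# One stored array G serves two independent data flows, one per direction.

def wordline_to_bitline(V, G):
    """First path: word-line voltages in, bit-line currents out (G^T . V)."""
    return [sum(V[i] * G[i][j] for i in range(len(G)))
            for j in range(len(G[0]))]

def bitline_to_wordline(V, G):
    """Second path: bit-line voltages in, word-line currents out (G . V)."""
    return [sum(G[i][j] * V[j] for j in range(len(G[0])))
            for i in range(len(G))]

G = [[1.0, 0.0],
     [2.0, 1.0]]
print(wordline_to_bitline([1.0, 1.0], G))  # [3.0, 1.0]
print(bitline_to_wordline([1.0, 1.0], G))  # [1.0, 3.0]
```

Because the two products use the same cells, two sets of neural network data travel through one physical array, which is the "twice the data" effect the disclosure describes.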

In some embodiments of this aspect, the first neural network data is related to the K-th layer of a first neural network model among the at least one neural network model, the second neural network data is identical to the third neural network data and is related to the (K+1)-th layer of the first neural network model, and the fourth neural network data is related to the (K+2)-th layer of the first neural network model. K is a positive integer.

In some embodiments of this aspect that store data related to the same neural network model, a first two-dimensional memory array among the two-dimensional memory arrays is coupled to a first sensing circuit and a second sensing circuit among the sensing circuits, and to a first encoding circuit and a second encoding circuit among the encoding circuits. The first encoding circuit receives the first neural network data; the first sensing circuit is coupled to the second encoding circuit and passes the third neural network data back to the first two-dimensional memory array as its second neural network data; and the second sensing circuit passes the fourth neural network data to a second two-dimensional memory array among the two-dimensional memory arrays, as the first neural network data of that array.
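The feedback path in this embodiment, where layer-(K+1) data sensed from the array is re-encoded and driven back into the same array in the other direction to yield layer-(K+2) data, can be sketched as two chained passes over one weight matrix. The functions and matrix are hypothetical stand-ins for the encoder/array/sense chain, with encoding and sensing gains idealized to 1.

```python
# Two consecutive layers computed on ONE array by using both directions.

def forward(V, G):
    """Word lines -> bit lines: computes G^T . V."""
    return [sum(V[i] * G[i][j] for i in range(len(G)))
            for j in range(len(G[0]))]

def reverse(V, G):
    """Bit lines -> word lines: computes G . V."""
    return [sum(G[i][j] * V[j] for j in range(len(G[0])))
            for i in range(len(G))]

G = [[1.0, 2.0],
     [0.0, 1.0]]
layer_k  = [1.0, 1.0]             # first neural network data (layer K)
layer_k1 = forward(layer_k, G)    # third data, fed back as the second data
layer_k2 = reverse(layer_k1, G)   # fourth data, passed to the next array
print(layer_k1, layer_k2)         # [1.0, 3.0] [7.0, 3.0]
```

In this sketch the same matrix G serves both passes; in the device, the second pass would use the weights as seen from the bit-line side of the same physical array.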

In some embodiments of this aspect that store data related to the same neural network model, every two-dimensional memory array receives the first neural network data via the word lines and the second neural network data via the bit lines, and transmits the third neural network data via the bit lines and the fourth neural network data via the word lines.

In some embodiments of this aspect that store data related to the same neural network model, one portion of the two-dimensional memory arrays receives the first neural network data via the word lines, transmits the third neural network data and receives the second neural network data via the bit lines, and then transmits the fourth neural network data via the word lines; another portion receives the first neural network data via the bit lines, transmits the third neural network data and receives the second neural network data via the word lines, and then transmits the fourth neural network data via the bit lines.

Another aspect of the present disclosure provides a three-dimensional memory device comprising a plurality of word lines, a plurality of bit lines, a three-dimensional memory array, a plurality of encoding circuits, and a plurality of sensing circuits. The three-dimensional memory array comprises a plurality of two-dimensional memory arrays, each of which comprises a plurality of equally sized sub-arrays that store first, second, third, and fourth neural network data related to at least one neural network model. The sub-arrays are coupled to the word lines and the bit lines, receive a plurality of first input voltages and output a plurality of first output currents, and receive a plurality of second input voltages and output a plurality of second output currents. The encoding circuits are respectively coupled to the sub-arrays and generate the first and second input voltages from the first and second neural network data, respectively. The sensing circuits are respectively coupled to the sub-arrays and generate the third and fourth neural network data from the sum of the first output currents and the sum of the second output currents, respectively.
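The current-summing scheme of this aspect can be sketched as follows: each sub-array produces a partial matrix-vector product over its slice of the inputs, and summing the sub-arrays' output currents per output line recovers the full product. The sizes, weights, and even split of the inputs are illustrative assumptions.

```python
# One logical 4-input x 2-output layer split across two 2 x 2 sub-arrays;
# the sensing circuit sees only the SUM of the sub-arrays' currents.

def subarray_currents(V, G):
    """Partial product of one sub-array: I_j = sum_i V_i * G[i][j]."""
    return [sum(V[i] * G[i][j] for i in range(len(G)))
            for j in range(len(G[0]))]

G_top = [[1.0, 0.0], [0.0, 1.0]]   # weights for inputs 1-2
G_bot = [[2.0, 0.0], [0.0, 2.0]]   # weights for inputs 3-4
V = [1.0, 1.0, 1.0, 1.0]

I_top = subarray_currents(V[:2], G_top)
I_bot = subarray_currents(V[2:], G_bot)
I_total = [a + b for a, b in zip(I_top, I_bot)]  # summed per output line
print(I_total)  # [3.0, 3.0]
```

Splitting a layer this way lets a weight matrix larger than any one sub-array be mapped across several equally sized sub-arrays without changing the sensed result.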

In some embodiments of this other aspect, the first neural network data is related to the K-th layer of a first neural network model among the at least one neural network model, and the third neural network data is related to the (K+1)-th layer of the first neural network model; the second neural network data is related to the M-th layer of a second neural network model among the at least one neural network model, and the fourth neural network data is related to the (M+1)-th layer of the second neural network model. The first neural network model differs from the second neural network model, and M and K are positive integers.

In some embodiments of this other aspect that store data related to different neural network models, a plurality of first sub-arrays among the sub-arrays are each coupled to two of the sensing circuits, a plurality of second sub-arrays are each coupled to two of the encoding circuits, and the sensing circuits coupled to the first sub-arrays are coupled to the encoding circuits coupled to the second sub-arrays, so as to: input the third neural network data of the first sub-arrays into the second sub-arrays as the first neural network data of the second sub-arrays; and input the fourth neural network data of the first sub-arrays into the second sub-arrays as the second neural network data of the second sub-arrays.

In some embodiments of this other aspect that store data related to different neural network models, every sub-array receives the first neural network data via the word lines and the second neural network data via the bit lines, and transmits the third neural network data via the bit lines and the fourth neural network data via the word lines.

In some embodiments of this other aspect that store data related to different neural network models, one portion of the sub-arrays receives the first neural network data via the word lines, receives the second neural network data via the bit lines, transmits the third neural network data via the bit lines, and transmits the fourth neural network data via the word lines; another portion receives the first neural network data via the bit lines, receives the second neural network data via the word lines, transmits the third neural network data via the word lines, and transmits the fourth neural network data via the bit lines.

In some embodiments of this other aspect, the first neural network data is related to the K-th layer of a first neural network model among the at least one neural network model, the second neural network data is identical to the third neural network data and is related to the (K+1)-th layer of the first neural network model, and the fourth neural network data is related to the (K+2)-th layer of the first neural network model. K is a positive integer.

In some embodiments of this other aspect that store data related to the same neural network model, a plurality of first sub-arrays among the sub-arrays are coupled to a plurality of first sensing circuits and a plurality of second sensing circuits among the sensing circuits, and to a plurality of first encoding circuits and a plurality of second encoding circuits among the encoding circuits. The first encoding circuits receive the first neural network data; the first sensing circuits are coupled to the second encoding circuits and pass the third neural network data back to the first sub-arrays as the second neural network data; and the second sensing circuits pass the fourth neural network data to a plurality of second sub-arrays among the sub-arrays, as the first neural network data of those sub-arrays.

In some embodiments of this other aspect that store data related to the same neural network model, every sub-array receives the first neural network data via the word lines and the second neural network data via the bit lines, and transmits the third neural network data via the bit lines and the fourth neural network data via the word lines.

In some embodiments of this other aspect that store data related to the same neural network model, one portion of the sub-arrays receives the first neural network data via the word lines, transmits the third neural network data and receives the second neural network data via the bit lines, and then transmits the fourth neural network data via the word lines; another portion receives the first neural network data via the bit lines, transmits the third neural network data and receives the second neural network data via the word lines, and then transmits the fourth neural network data via the bit lines.

In some embodiments of this other aspect, the first, second, third, and fourth neural network data differ from one another and are all related to a neural network layer of one of the at least one neural network model.

In some embodiments of this other aspect that store data related to the same neural network layer of the same neural network model, every sub-array receives the first neural network data via the word lines and the second neural network data via the bit lines, and transmits the third neural network data via the bit lines and the fourth neural network data via the word lines.

With the three-dimensional memory devices of the two aspects of this disclosure, signals can travel through the memory array in different directions, allowing two sets of neural network data to be stored and thereby increasing the storage capacity of the device.

100: three-dimensional memory device
110: three-dimensional memory array
111_1~111_p: two-dimensional memory arrays
111_1A~111_1J: sub-arrays
120: encoding circuit
130: sensing circuit
140: processing circuit
V, V1~Vn: input voltages
I, I1~Im: output currents
A1~An, B1~Bm: neural network data
C1~Cm, D1~Dn: neural network data
G11~G1m, G21~G2m, Gn1~Gnm: memory cells
W11~W1m, W21~W2m, Wn1~Wnm: impedances/weights
WL1~WLn: word lines
BL1~BLm: bit lines
X, Y, Z: directions

To make the above and other objects, features, advantages, and embodiments of this disclosure easier to understand, the accompanying drawings are described as follows:
FIG. 1 is a perspective view of a three-dimensional memory device according to some embodiments of the present disclosure;
FIG. 2A is a schematic diagram of an encoding circuit, a sensing circuit, and a two-dimensional memory array according to some examples;
FIG. 2B is a schematic diagram of the internal structure and current paths of a two-dimensional memory array according to some embodiments of the present disclosure;
FIG. 2C is a schematic diagram of a neural network according to some embodiments of the present disclosure;
FIG. 2D is a schematic diagram of the internal structure and current paths of a two-dimensional memory array according to some embodiments of the present disclosure;
FIG. 3A is a circuit diagram of a two-dimensional memory array according to some embodiments of the present disclosure;
FIG. 3B is a circuit diagram of a two-dimensional memory array according to other embodiments of the present disclosure;
FIG. 3C is a circuit diagram of a two-dimensional memory array according to still other embodiments of the present disclosure;
FIG. 4A is a schematic diagram of a two-dimensional memory array storing neural network data according to some embodiments of the present disclosure;
FIG. 4B is a schematic diagram of a two-dimensional memory array storing neural network data according to other embodiments of the present disclosure;
FIG. 4C is a schematic diagram of a two-dimensional memory array storing neural network data according to still other embodiments of the present disclosure;
FIG. 5 is a schematic diagram of the relationship between a two-dimensional memory array and its sub-arrays according to some embodiments of the present disclosure;
FIG. 6A is a schematic diagram of sub-arrays storing neural network data according to some embodiments of the present disclosure;
FIG. 6B is a schematic diagram of sub-arrays storing neural network data according to other embodiments of the present disclosure;
FIG. 6C is a schematic diagram of sub-arrays storing neural network data according to still other embodiments of the present disclosure; and
FIG. 6D is a schematic diagram of sub-arrays storing neural network data according to still further embodiments of the present disclosure.

Embodiments of the present disclosure are described below with reference to the accompanying drawings. In the drawings, the same reference numerals denote the same or similar elements or method steps.

In this disclosure, when an element is described as "connected", it may be "electrically connected" or "optically connected"; when an element is described as "coupled", it may be "electrically coupled" or "optically coupled". "Connected" or "coupled" may also indicate that two or more elements cooperate or interact with each other. Unless the context specifically limits the articles, "a" and "the" may refer to one or more. It will be further understood that "comprise", "include", "have", and similar terms used herein specify the recited features, regions, integers, steps, operations, elements, and/or components, but do not preclude one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.

FIG. 1 is a perspective view of a three-dimensional memory device 100 according to some embodiments of the present disclosure. In some embodiments, the three-dimensional memory device 100 includes a three-dimensional memory array 110, a plurality of encoding circuits 120, a plurality of sensing circuits 130, a plurality of processing circuits 140, word lines WL1~WLn, and bit lines BL1~BLm. Note that, for simplicity, FIG. 1 omits the word lines WL1~WLn and the bit lines BL1~BLm (described in later paragraphs and figures) and shows only one encoding circuit 120, one sensing circuit 130, and one processing circuit 140.

The three-dimensional memory array 110 is coupled between the encoding circuit 120 and the sensing circuit 130; it receives the input voltage V from the encoding circuit 120 and passes the output current I to the sensing circuit 130. In some embodiments, the three-dimensional memory array 110 includes two-dimensional memory arrays 111_1~111_p, where p is a positive integer. The planes of the two-dimensional memory arrays 111_1~111_p extend along one planar direction (for example, the plane spanned by directions X and Z in FIG. 1) and are stacked along another direction (for example, direction Y in FIG. 1) to form a three-dimensional structure.

In some embodiments, the three-dimensional memory array 110 may be implemented with volatile memory (for example, dynamic random access memory (DRAM) or static random access memory (SRAM)), non-volatile memory (for example, magnetoresistive random access memory (MRAM) or ferroelectric random access memory (FeRAM)), or a combination thereof.

In some embodiments, each of the two-dimensional memory arrays 111_1~111_p includes a plurality of memory cells (for example, the transistors shown in FIG. 1). The internal structure of the memory cells is described in detail in later paragraphs.

The encoding circuit 120 is coupled to the two-dimensional memory arrays 111_1~111_p of the three-dimensional memory array 110 and to the processing circuit 140; it receives the neural network data A1~An and B1~Bm from the processing circuit 140 and, based on that data, delivers the corresponding input voltage V to the two-dimensional memory arrays 111_1~111_p.

The sensing circuit 130 is coupled to the two-dimensional memory arrays 111_1~111_p of the three-dimensional memory array 110 and to the processing circuit 140. It receives the output current I from the two-dimensional memory arrays 111_1~111_p, computes the corresponding neural network data C1~Cm and D1~Dn from the current I, and then transmits the neural network data C1~Cm and D1~Dn to the processing circuit 140.

The processing circuit 140 is coupled to the encoding circuit 120 and the sensing circuit 130. It transmits the neural network data A1~An and B1~Bm to the encoding circuit 120 and receives the neural network data C1~Cm and D1~Dn from the sensing circuit 130.

Since the two-dimensional memory arrays 111_1~111_p are coupled to each encoding circuit 120 and each sensing circuit 130 in a similar manner, for brevity, only the connections between the two-dimensional memory array 111_1 and the encoding circuit 120 and sensing circuit 130 are described with reference to FIG. 2A. FIG. 2A is a schematic diagram of the encoding circuit 120, the sensing circuit 130, and the two-dimensional memory array 111_1 according to some examples. In some embodiments, the two-dimensional memory array 111_1 includes memory cells G11~G1m, G21~G2m, ..., Gn1~Gnm, arranged in a rectangular array with m columns and n rows, where m and n are positive integers.

As shown in FIG. 2A, after receiving the neural network data A1~An, the encoding circuit 120 generates the corresponding input voltages V1~Vn to the two-dimensional memory array 111_1. The two-dimensional memory array 111_1 then produces the output currents I1~Im to the sensing circuit 130, and the sensing circuit 130, upon receiving the output currents I1~Im, generates the corresponding neural network data C1~Cm from them.

For the internal structure and current paths of the two-dimensional memory array 111_1, refer to FIG. 2B. FIG. 2B is a schematic diagram of the internal structure and current paths of the two-dimensional memory array 111_1 according to some embodiments of the present disclosure.

In some embodiments, the memory cells G11~G1m, G21~G2m, ..., Gn1~Gnm have impedances W11~W1m, W21~W2m, ..., Wn1~Wnm, respectively. In operation, when the input voltages V1~Vn are applied to the two-dimensional memory array 111_1 through the word lines WL1~WLn, the two-dimensional memory array 111_1 produces one portion of the output current I1 on the bit line BL1 according to the input voltage V1 (corresponding to the neural network data A1) and the impedance W11 of the memory cell G11, produces another portion of the output current I1 on the bit line BL1 according to the input voltage V2 (corresponding to the neural network data A2) and the impedance W21 of the memory cell G21, and so on. The input voltages V1~Vn (corresponding to the neural network data A1~An) therefore produce n portions of current on the bit line BL1, and the sum of these n portions is the output current I1 (corresponding to the neural network data C1).

Similarly, the two-dimensional memory array 111_1 produces one portion of the output current I2 on the bit line BL2 according to the input voltage V1 (corresponding to the neural network data A1) and the impedance W12 of the memory cell G12, produces another portion of the output current I2 on the bit line BL2 according to the input voltage V2 (corresponding to the neural network data A2) and the impedance W22 of the memory cell G22, and so on. The input voltages V1~Vn (corresponding to the neural network data A1~An) thus produce n portions of current on the bit line BL2, whose sum is the output current I2 (corresponding to the neural network data C2). The output currents I3~Im (corresponding to the neural network data C3~Cm) are produced in the same manner as the output currents I1 and I2, and the description is not repeated here for brevity.

The correspondence among the input voltages V1~Vn, the impedances W11~W1m, W21~W2m, ..., Wn1~Wnm, and the output currents I1~Im can be used to implement the computation between two adjacent layers of a neural network model. Refer to FIG. 2B together with FIG. 2C; FIG. 2C is a schematic diagram of a neural network according to some embodiments of the present disclosure.

In the embodiment of FIG. 2C, the neural network data A1~An are stored in the neurons of the K-th neural network layer, and the neural network data C1~Cm are stored in the neurons of the (K+1)-th neural network layer. The neural network data of a neuron in the current layer is the sum of the neural network data of all neurons in the previous layer, each multiplied by its corresponding weight. For example, in the embodiment of FIG. 2C, the neural network data C1 is the sum of the neural network data A1~An multiplied by the weights W11, W21, ..., Wn1, respectively; the neural network data C2 is the sum of the neural network data A1~An multiplied by the weights W12, W22, ..., Wn2 (not shown for brevity), respectively; and so on. The computation of the neural network data C1~Cm can therefore be expressed by the following <Formula 1>:

<Formula 1>: Cj = A1 x W1j + A2 x W2j + ... + An x Wnj, for j = 1, ..., m

Since the neural network data C1~Cm are computed in the same way in FIG. 2B and FIG. 2C, the impedances W11, W21, ..., Wn1 in FIG. 2B can be regarded as implementing the weights W11, W21, ..., Wn1 in the neural network model of FIG. 2C, thereby implementing the function of storing the neural network data of the K-th neural network layer.
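As a rough behavioral sketch of the multiply-and-accumulate operation in <Formula 1> (an illustrative model only, not part of the disclosed device; the function name `crossbar_forward` and the toy values are assumptions):

```python
# Illustrative model of <Formula 1>: word-line voltages A1~An are weighted
# by the cell impedances W[i][j] and accumulated as bit-line currents C1~Cm.

def crossbar_forward(a, w):
    """Compute C_j = sum_i A_i * W_ij for a crossbar with n word lines
    and m bit lines; `a` has length n, `w` is an n x m grid of weights."""
    n, m = len(w), len(w[0])
    assert len(a) == n
    # Each bit line j accumulates the current contributed by every
    # word line i through the cell at position (i, j).
    return [sum(a[i] * w[i][j] for i in range(n)) for j in range(m)]

# Example: 2 word lines, 3 bit lines.
A = [1.0, 2.0]
W = [[0.5, 1.0, 0.0],
     [1.5, 0.5, 2.0]]
print(crossbar_forward(A, W))  # [3.5, 2.0, 4.0]
```

The nested sum mirrors how the n current portions on each bit line add up physically before the sensing circuit converts them back into neural network data.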

In some embodiments, in addition to receiving neural network data through the word lines WL1~WLn and outputting neural network data through the bit lines BL1~BLm, the two-dimensional memory array 111_1 can also receive neural network data through the bit lines BL1~BLm and output neural network data through the word lines WL1~WLn. Refer to FIG. 2D, which is a schematic diagram of the internal structure and current paths of the two-dimensional memory array 111_1 according to some embodiments of the present disclosure.

In the embodiment of FIG. 2D, the two-dimensional memory array 111_1 receives the neural network data B1 from the bit line BL1 and produces one portion of the neural network data D1 on the word line WL1 according to the neural network data B1 and the impedance W11 of the memory cell G11; it receives the neural network data B2 from the bit line BL2 and produces another portion of the neural network data D1 on the word line WL1 according to the neural network data B2 and the impedance W12 of the memory cell G12, and so on. The input of the neural network data B1~Bm therefore produces m portions on the word line WL1, and the sum of these m portions is the neural network data D1; the same applies to the other word lines. Accordingly, the neural network data Dn can be computed from the neural network data B1~Bm received on the bit lines BL1~BLm and the impedance on the path of each neural network data to the word line WLn (as shown in FIG. 2D).

Therefore, similarly to the embodiment of FIG. 2B, the correspondence among the neural network data B1~Bm, the impedances W11~W1m, W21~W2m, ..., Wn1~Wnm, and the neural network data D1~Dn in FIG. 2D can likewise be used to implement the computation between two adjacent layers of a neural network model (for example, the neural network model in FIG. 2C). The computation of the neural network data D1~Dn can be expressed by the following <Formula 2>:

<Formula 2>: Di = B1 x Wi1 + B2 x Wi2 + ... + Bm x Wim, for i = 1, ..., n

In summary, the two-dimensional memory array 111_1 can store two different sets of data in the same two-dimensional memory array through two schemes: receiving neural network data through the word lines and outputting it through the bit lines, and receiving neural network data through the bit lines and outputting it through the word lines.
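The reverse direction of <Formula 2> can be sketched in the same illustrative style (again an assumption-laden model, not the disclosed circuitry; the name `crossbar_reverse` is hypothetical). The key point it demonstrates is that one and the same grid of cell impedances serves both data streams:

```python
# Illustrative model of <Formula 2>: bit-line inputs B1~Bm are weighted by
# the same cell impedances W[i][j] and accumulated on the word lines as D1~Dn.

def crossbar_reverse(b, w):
    """Compute D_i = sum_j B_j * W_ij; `b` has length m, `w` is n x m."""
    n, m = len(w), len(w[0])
    assert len(b) == m
    return [sum(b[j] * w[i][j] for j in range(m)) for i in range(n)]

# The identical weight grid used in the forward direction is reused here,
# which is how one 2D array stores two different sets of neural network data.
W = [[0.5, 1.0, 0.0],
     [1.5, 0.5, 2.0]]
D = crossbar_reverse([1.0, 1.0, 1.0], W)
print(D)  # [1.5, 4.0]
```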

For the implementation of the memory cells G11~G1m, G21~G2m, ..., Gn1~Gnm, refer to FIGS. 3A~3C, which are circuit diagrams of the two-dimensional memory array 111_1 according to some different embodiments of the present disclosure.

In some embodiments, the memory cells G11~G1m, G21~G2m, ..., Gn1~Gnm can be connected by horizontal and vertical conductive lines (e.g., word lines and bit lines) to form a cross-point type array. For example, in the embodiment of FIG. 3A, the memory cells G11~G13 and the memory cells G21~G23 and G31~G33 (not labeled for brevity) are coupled to their adjacent memory cells through horizontal and vertical conductive lines, and each memory cell (i.e., each cross point of the cross-point array) is implemented as a circuit including one resistor.

In other embodiments, the memory cells G11~G1m, G21~G2m, ..., Gn1~Gnm can likewise be connected by horizontal and vertical conductive lines, and their conduction states can further be controlled through additional conductive lines, to form a NOR-type array. For example, in the embodiment of FIG. 3B, the memory cells G11~G13 and the memory cells G21~G23 and G31~G33 (not labeled for brevity) are coupled to their adjacent memory cells through horizontal and vertical conductive lines, and each memory cell is implemented as a circuit including one resistor and one capacitor. In addition, the control terminals of the memory cells in the same row of the memory array are connected to an additional conductive line that controls whether those cells are turned on.

Similarly to FIG. 3B, in the embodiment of FIG. 3C the memory cells G11~G13 and the memory cells G21~G23 and G31~G33 also form a NOR-type array. The difference is that each memory cell in FIG. 3C is implemented as a circuit including one inductor and one capacitor.

It should be noted that the implementations of the memory cells G11~G13, G21~G23, and G31~G33 in FIGS. 3A~3C are merely examples and are not intended to limit the present disclosure. As long as the circuit structure of the memory cells satisfies the conditions of a cross-point array or a NOR-type array, other implementations of the memory cells fall within the scope of the present disclosure.

FIG. 4A is a schematic diagram of the two-dimensional memory arrays 111_1 and 111_2 storing neural network data according to some embodiments of the present disclosure. It should be noted that, for simplicity, the word lines and bit lines connected to each two-dimensional memory array are omitted from FIGS. 4A~4C and FIGS. 6A~6D. When the encoding circuit 120 is connected to the left or right side of a two-dimensional memory array, the array receives neural network data through the word lines; when the encoding circuit 120 is connected to the upper or lower side, the array receives neural network data through the bit lines. When the sensing circuit 130 is connected to the left or right side of a two-dimensional memory array, the array delivers neural network data through the word lines; when the sensing circuit 130 is connected to the upper or lower side, the array delivers neural network data through the bit lines.

In the embodiment of FIG. 4A, the two-dimensional memory array 111_1 is coupled to the two-dimensional memory array 111_2 through one set of encoding circuit 120, sensing circuit 130, and processing circuit 140, so as to store the neural network data related to the K-th neural network layer of the first neural network model. In detail, after the encoding circuit 120 inputs the neural network data related to the K-th neural network layer of the first neural network model into the two-dimensional memory array 111_1, through the weighting and summation performed by the two-dimensional memory array 111_1, the sensing circuit 130 can compute the neural network data related to the (K+1)-th neural network layer of the first neural network model, which is then passed through the processing circuit 140 as the input neural network data of the two-dimensional memory array 111_2 for subsequent computation.

Similarly, the two-dimensional memory array 111_2 is likewise coupled to the two-dimensional memory array 111_3 through a set of encoding circuit 120, sensing circuit 130, and processing circuit 140, and computes the neural network data related to the (K+2)-th neural network layer of the first neural network model in a manner similar to the two-dimensional memory array 111_1, thereby storing the neural network data related to the (K+1)-th neural network layer of the first neural network model.

In addition, the two-dimensional memory array 111_1 is further coupled to the two-dimensional memory array 111_2 through another set of encoding circuit 120, sensing circuit 130, and processing circuit 140, so as to store the neural network data related to the K-th neural network layer of the second neural network model; the two-dimensional memory array 111_2 is further coupled to the two-dimensional memory array 111_3 through another set of encoding circuit 120, sensing circuit 130, and processing circuit 140, so as to store the neural network data related to the (K+1)-th neural network layer of the second neural network model.

Each two-dimensional memory array can therefore store two sets of data through the two connection schemes. The connection and data-transfer schemes of the two-dimensional memory arrays 111_3~111_p are similar to those of the two-dimensional memory arrays 111_1 and 111_2 and are not repeated here.

In some embodiments, the two-dimensional memory arrays 111_1~111_p all receive the neural network data related to the first neural network model through the word lines and deliver the neural network data related to the first neural network model through the bit lines (as shown in the upper half of FIG. 4A), so as to store the first set of data. Moreover, the two-dimensional memory arrays 111_1~111_p further receive the neural network data related to the second neural network model through the bit lines and deliver the neural network data related to the second neural network model through the word lines (as shown in the lower half of FIG. 4A), so as to store the second set of data.

FIG. 4B is a schematic diagram of the two-dimensional memory arrays 111_1 and 111_2 storing neural network data according to other embodiments of the present disclosure. Similarly to FIG. 4A, the two-dimensional memory array 111_1 in FIG. 4B also stores the neural network data related to the K-th neural network layer of the first neural network model and the neural network data related to the K-th neural network layer of the second neural network model, and the two-dimensional memory array 111_2 also stores the neural network data related to the (K+1)-th neural network layer of the first neural network model and the neural network data related to the (K+1)-th neural network layer of the second neural network model.

Unlike FIG. 4A, in the embodiment of FIG. 4B some of the two-dimensional memory arrays 111_1~111_p can receive the neural network data related to the first neural network model through the word lines and then deliver it through the bit lines, while the others can receive the neural network data related to the first neural network model through the bit lines and then deliver it through the word lines, so as to store the first set of data.

For example, as shown in the upper half of FIG. 4B, the two-dimensional memory array 111_1 receives the neural network data of the K-th neural network layer of the first neural network model through the word lines and then delivers the neural network data related to the (K+1)-th neural network layer of the first neural network model through the bit lines, whereas the two-dimensional memory array 111_2 receives the neural network data related to the (K+1)-th neural network layer of the first neural network model through the bit lines and then delivers the neural network data related to the (K+2)-th neural network layer of the first neural network model through the word lines.

Accordingly, when the two-dimensional memory arrays 111_1~111_p store the neural network data related to the second neural network model (i.e., the second set of data), some of the two-dimensional memory arrays can likewise receive neural network data through the bit lines and then deliver it through the word lines, while the others receive neural network data through the word lines and then deliver it through the bit lines.

Continuing the embodiment shown in FIG. 4B, the two-dimensional memory array 111_1 receives the neural network data related to the K-th neural network layer of the second neural network model through the bit lines and then delivers the neural network data related to the (K+1)-th neural network layer of the second neural network model through the word lines, whereas the two-dimensional memory array 111_2 receives the neural network data related to the (K+1)-th neural network layer of the second neural network model through the word lines and then delivers the neural network data related to the (K+2)-th neural network layer of the second neural network model through the bit lines.

It should be noted that although the two-dimensional memory arrays 111_1~111_p described in FIGS. 4A and 4B and the foregoing paragraphs store neural network data related to the same neural network layer of the two neural network models, the present disclosure is not limited thereto. In some embodiments, the two-dimensional memory arrays 111_1~111_p can store neural network data related to different neural network layers of the two neural network models. For example, the two-dimensional memory array 111_1 can store neural network data related to the first neural network layer of the first neural network model and to the fifth neural network layer of the second neural network model.

Furthermore, the two-dimensional memory arrays 111_1~111_p of the present disclosure are not limited to storing neural network data related to two neural network models. In some embodiments, each two-dimensional memory array stores neural network data related to two adjacent neural network layers of one neural network model.

Refer to FIG. 4C, which is a schematic diagram of the two-dimensional memory array 111_1 storing neural network data according to still other embodiments of the present disclosure.

In the embodiment of FIG. 4C, the two-dimensional memory array 111_1 first receives, via the first set of encoding circuit 120 and sensing circuit 130, the neural network data related to the K-th neural network layer of the first neural network model through the word lines, and then delivers the neural network data related to the (K+1)-th neural network layer through the bit lines. Next, the processing circuit 140 passes the neural network data related to the (K+1)-th neural network layer to the second set of encoding circuit 120 and sensing circuit 130 of the two-dimensional memory array 111_1, so that the two-dimensional memory array 111_1 receives the neural network data related to the (K+1)-th neural network layer through the bit lines and then delivers the neural network data related to the (K+2)-th neural network layer to the two-dimensional memory array 111_2 through the word lines. This configuration thus enables one two-dimensional memory array to store the neural network data related to two adjacent neural network layers of one neural network model.
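The two-pass flow described above can be sketched as a ping-pong through one weight grid (an illustrative assumption: the second pass applies the same cells in the transposed direction, word lines and bit lines swapped; the name `matvec` and the toy values are hypothetical):

```python
# Illustrative model of the two-pass scheme: pass 1 drives the word lines
# and senses the bit lines (layer K -> K+1); pass 2 feeds the result back
# through the bit lines and senses the word lines (layer K+1 -> K+2).

def matvec(w, x, transpose=False):
    """W is n x m. Forward: y_j = sum_i x_i*W_ij (word lines -> bit lines).
    Transpose: y_i = sum_j x_j*W_ij (bit lines -> word lines)."""
    n, m = len(w), len(w[0])
    if transpose:
        return [sum(x[j] * w[i][j] for j in range(m)) for i in range(n)]
    return [sum(x[i] * w[i][j] for i in range(n)) for j in range(m)]

W = [[1.0, 0.0],
     [0.5, 2.0]]
layer_k = [1.0, 1.0]
layer_k1 = matvec(W, layer_k)               # pass 1: word lines in, bit lines out
layer_k2 = matvec(W, layer_k1, True)        # pass 2: bit lines in, word lines out
print(layer_k1, layer_k2)  # [1.5, 2.0] [1.5, 4.75]
```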

It should be noted that although the two-dimensional memory array 111_1 in FIG. 4C is illustrated as first receiving neural network data through the word lines and delivering neural network data through the bit lines (i.e., the first set of data), and then receiving neural network data through the bit lines and delivering neural network data through the word lines (i.e., the second set of data), the present disclosure is not limited thereto. In some embodiments, some of the two-dimensional memory arrays 111_1~111_p can first receive/deliver the first set of neural network data through the word lines/bit lines and then receive/deliver the second set of neural network data through the bit lines/word lines, while the others can first receive/deliver the first set of neural network data through the bit lines/word lines and then receive/deliver the second set of neural network data through the word lines/bit lines.

FIG. 5 is a schematic diagram of the relationship between the two-dimensional memory array 111_1 and the sub-arrays 111_1A~111_1J according to some embodiments of the present disclosure. In some embodiments, the two-dimensional memory array 111_1 (and the other two-dimensional memory arrays in the three-dimensional memory array 110) can be divided into multiple sub-arrays of the same size to store neural network data.

In some embodiments, the total size of the divided sub-arrays can be equal to the size of the divided two-dimensional memory array. For example, the sub-arrays 111_1A~111_1D are all 4x4 arrays, and their total size is the same as that of the 8x8 two-dimensional memory array 111_1.

In other embodiments, the total size of the divided sub-arrays can be greater than the size of the divided two-dimensional memory array. For example, the sub-arrays 111_1E~111_1J are all 3x5 arrays, and their total size is greater than that of the 8x8 two-dimensional memory array 111_1; in this case, the voltages received by the extra rows and columns of the arrays are set to 0.
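The padded tiling described above can be sketched as follows (an illustrative model; the name `split_into_subarrays` is hypothetical, and padded positions model the rows and columns whose voltages are set to 0). With 3x5 tiles over an 8x8 grid, the sketch yields six tiles, matching the six sub-arrays 111_1E~111_1J:

```python
# Illustrative model of padded tiling: split an R x C weight grid into
# fixed-size tiles, filling positions beyond the grid with 0 (the extra
# rows/columns whose input voltages are set to 0).

def split_into_subarrays(w, rows, cols):
    """Split an R x C grid into rows x cols tiles, zero-padding the edges."""
    R, C = len(w), len(w[0])
    tiles = []
    for r0 in range(0, R, rows):
        for c0 in range(0, C, cols):
            tile = [[w[r0 + r][c0 + c] if r0 + r < R and c0 + c < C else 0.0
                     for c in range(cols)] for r in range(rows)]
            tiles.append(tile)
    return tiles

grid = [[float(r * 8 + c) for c in range(8)] for r in range(8)]
tiles = split_into_subarrays(grid, 3, 5)
print(len(tiles))  # 6
```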

FIG. 6A is a schematic diagram of the sub-arrays 111_1A~111_1C storing neural network data according to some embodiments of the present disclosure. In some embodiments, the sub-arrays 111_1A~111_1C in FIG. 6A together implement the two-dimensional memory array 111_1 in FIG. 2A.

In detail, each of the sub-arrays 111_1A~111_1C is connected to one set of encoding circuit 120 and sensing circuit 130, and these three sensing circuits 130 are connected to one processing circuit 140, which sums the neural network data output by the three sensing circuits 130 to obtain the neural network data related to the K-th neural network layer of the first neural network model. In addition, each of the sub-arrays 111_1A~111_1C is further connected to another set of encoding circuit 120 and sensing circuit 130, and these three sensing circuits 130 are connected to one processing circuit 140, which sums the neural network data output by the three sensing circuits 130 to obtain the neural network data related to the K-th neural network layer of the second neural network model.

Similarly to the two-dimensional memory arrays 111_1~111_p in FIG. 2A, the sub-arrays 111_1A~111_1C in FIG. 6A all receive the neural network data related to the first neural network model through the word lines and deliver the neural network data related to the first neural network model through the bit lines (as shown in the upper half of FIG. 6A), so as to store the first set of data. Moreover, the sub-arrays 111_1A~111_1C further receive the neural network data related to the second neural network model through the bit lines and deliver the neural network data related to the second neural network model through the word lines (as shown in the lower half of FIG. 6A), so as to store the second set of data.

FIG. 6B is a schematic diagram of the sub-arrays 111_1A~111_1C storing neural network data according to other embodiments of the present disclosure. In some embodiments, the sub-arrays 111_1A~111_1C in FIG. 6B together implement the two-dimensional memory array 111_1 in FIG. 2B.

In detail, similarly to FIG. 4B, some of the sub-arrays 111_1A~111_1C can receive the neural network data related to the first neural network model through the word lines and then deliver it through the bit lines, while the others can receive the neural network data related to the first neural network model through the bit lines and then deliver it through the word lines, so as to jointly store the first set of data (as shown in the upper half of FIG. 6B). The way the sub-arrays 111_1A~111_1C store the second set of data is likewise similar to FIG. 4B and is not repeated here.

FIG. 6C is a schematic diagram of the sub-arrays 111_1A~111_1C storing neural network data according to other embodiments of the present disclosure. In some embodiments, the sub-arrays 111_1A~111_1C in FIG. 6C jointly implement the two-dimensional memory array 111_1 in FIG. 2C.

In detail, similar to FIG. 4C, the sub-arrays 111_1A~111_1C store neural network data associated with two adjacent neural network layers of one neural network model. First, through three encoding circuits 120 and three sensing circuits 130, the sub-arrays 111_1A~111_1C receive the neural network data associated with the K-th neural network layer via the word lines and transmit the output data via the bit lines. The processing circuit 140 then sums the outputs of the three sensing circuits 130 to obtain the neural network data associated with the (K+1)-th neural network layer, and passes it to another three encoding circuits 120 and another three sensing circuits 130. The sub-arrays 111_1A~111_1C can therefore receive the (K+1)-th-layer data via the bit lines and transmit the output data via the word lines to another processing circuit 140, which sums it into the neural network data associated with the (K+2)-th neural network layer.
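The first stage of this pipeline — three sub-arrays producing partial products that the processing circuit 140 sums into the (K+1)-th-layer data — can be sketched as follows. The matrix sizes and the row-wise split of the K-th-layer weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical K-th-layer weights split row-wise across three equal-size
# sub-arrays (111_1A, 111_1B, 111_1C): 12 inputs x 5 outputs -> three 4x5 tiles.
W_k = rng.uniform(size=(12, 5))
tiles = [W_k[0:4], W_k[4:8], W_k[8:12]]

x_k = rng.uniform(size=12)     # K-th-layer activations (encoded as voltages)

# Each sub-array's sensing circuit reads one partial bit-line current vector.
partials = [tiles[i].T @ x_k[4 * i:4 * (i + 1)] for i in range(3)]

# Processing circuit 140: sum the three sensing-circuit outputs to obtain
# the (K+1)-th-layer data, which is then re-encoded for the transposed pass.
x_k1 = partials[0] + partials[1] + partials[2]

assert np.allclose(x_k1, W_k.T @ x_k)  # matches the full-matrix product
```

Summing the per-tile currents is exactly a block decomposition of the matrix-vector product, which is why equal-size tiles can stand in for one larger array.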

In addition, similar to FIG. 4C, in some embodiments one portion of the sub-arrays 111_1A~111_1C may first receive/transmit the first set of neural network data through the word lines/bit lines and then receive/transmit the second set through the bit lines/word lines, while another portion may first receive/transmit the first set through the bit lines/word lines and then receive/transmit the second set through the word lines/bit lines.

FIG. 6D is a schematic diagram of the sub-arrays 111_1A~111_1C storing neural network data according to still other embodiments of the present disclosure. FIG. 6D is similar to FIG. 6C, except that the sub-arrays 111_1A~111_1C in FIG. 6D store neural network data associated with two portions of a single neural network layer of one neural network model. In other words, the sub-arrays 111_1A~111_1C in FIG. 6D take the portion [a, b, c] of the K-th neural network layer as the first set of neural network data and the portion [d, e, f] of the K-th neural network layer as the second set, so as to jointly store the neural network data associated with the K-th layer.
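A minimal sketch of this split (the half-and-half column partition and all sizes are assumptions): the K-th layer's weight columns for outputs [a, b, c] form the first data set, those for [d, e, f] form the second, and the two partial outputs together reproduce the whole layer:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical K-th-layer weights for six output neurons [a, b, c, d, e, f].
W = rng.uniform(size=(4, 6))
x = rng.uniform(size=4)    # K-th-layer input activations

y_abc = W[:, :3].T @ x     # first data set: outputs [a, b, c]
y_def = W[:, 3:].T @ x     # second data set: outputs [d, e, f]

# Concatenating the two partial outputs gives the full K-th-layer output.
y = np.concatenate([y_abc, y_def])
assert np.allclose(y, W.T @ x)
```

Since each output neuron depends only on its own weight column, the layer can be cut along the output dimension into two independently stored data sets.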

With the configuration of the three-dimensional memory device 100 proposed in the present disclosure, two sets of neural network data can be stored by inputting them via the word lines/bit lines and outputting them via the bit lines/word lines respectively, thereby improving the storage capacity of the three-dimensional memory device 100.

The above are merely preferred embodiments of the present disclosure; various modifications and equivalent changes may be made to the structures of the present disclosure without departing from its scope or spirit. Accordingly, all modifications and equivalent changes made to the present disclosure within the scope of the following claims fall within the coverage of the present disclosure.

111_1, 111_2: two-dimensional memory array

120: encoding circuit

130: sensing circuit

140: processing circuit

Claims (20)

1. A three-dimensional memory device, comprising: a plurality of word lines; a plurality of bit lines; a three-dimensional memory array comprising a plurality of two-dimensional memory arrays and configured to store first neural network data, second neural network data, third neural network data and fourth neural network data associated with at least one neural network model, wherein each of the two-dimensional memory arrays is coupled to the plurality of word lines and the plurality of bit lines, and is configured to receive a first input voltage and output a first output current, and to receive a second input voltage and output a second output current; a plurality of encoding circuits respectively coupled to the plurality of two-dimensional memory arrays and configured to generate the first input voltage and the second input voltage according to the first neural network data and the second neural network data, respectively; and a plurality of sensing circuits respectively coupled to the plurality of two-dimensional memory arrays and configured to generate the third neural network data and the fourth neural network data according to the first output current and the second output current, respectively.
2. The three-dimensional memory device of claim 1, wherein the first neural network data is associated with a K-th neural network layer of a first neural network model of the at least one neural network model, and the third neural network data is associated with a (K+1)-th neural network layer of the first neural network model; and the second neural network data is associated with an M-th neural network layer of a second neural network model of the at least one neural network model, and the fourth neural network data is associated with an (M+1)-th neural network layer of the second neural network model, wherein the first neural network model is different from the second neural network model, and M and K are positive integers.

3. The three-dimensional memory device of claim 2, wherein a first two-dimensional memory array of the plurality of two-dimensional memory arrays is coupled to two of the plurality of sensing circuits, a second two-dimensional memory array of the plurality of two-dimensional memory arrays is coupled to two of the plurality of encoding circuits, and the two sensing circuits are respectively coupled to the two encoding circuits and are respectively configured to: input the third neural network data of the first two-dimensional memory array into the second two-dimensional memory array to serve as the first neural network data of the second two-dimensional memory array; and input the fourth neural network data of the first two-dimensional memory array into the second two-dimensional memory array to serve as the second neural network data of the second two-dimensional memory array.

4. The three-dimensional memory device of claim 3, wherein each of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of word lines and receives the second neural network data via the plurality of bit lines, and each of the plurality of two-dimensional memory arrays transmits the third neural network data via the plurality of bit lines and transmits the fourth neural network data via the plurality of word lines.

5. The three-dimensional memory device of claim 3, wherein one portion of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of word lines, receives the second neural network data via the plurality of bit lines, transmits the third neural network data via the plurality of bit lines, and transmits the fourth neural network data via the plurality of word lines; and another portion of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of bit lines, receives the second neural network data via the plurality of word lines, transmits the third neural network data via the plurality of word lines, and transmits the fourth neural network data via the plurality of bit lines.
6. The three-dimensional memory device of claim 1, wherein the first neural network data is associated with a K-th neural network layer of a first neural network model of the at least one neural network model, the second neural network data is identical to the third neural network data and is associated with a (K+1)-th neural network layer of the first neural network model, and the fourth neural network data is associated with a (K+2)-th neural network layer of the first neural network model, wherein K is a positive integer.

7. The three-dimensional memory device of claim 6, wherein a first two-dimensional memory array of the plurality of two-dimensional memory arrays is coupled to a first sensing circuit and a second sensing circuit of the plurality of sensing circuits, and is coupled to a first encoding circuit and a second encoding circuit of the plurality of encoding circuits, wherein the first encoding circuit is configured to receive the first neural network data; the first sensing circuit is coupled to the second encoding circuit and configured to transmit the third neural network data to the first two-dimensional memory array as the second neural network data; and the second sensing circuit is configured to transmit the fourth neural network data to a second two-dimensional memory array of the plurality of two-dimensional memory arrays to serve as the first neural network data of the second two-dimensional memory array.
8. The three-dimensional memory device of claim 7, wherein each of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of word lines and receives the second neural network data via the plurality of bit lines, and each of the plurality of two-dimensional memory arrays transmits the third neural network data via the plurality of bit lines and transmits the fourth neural network data via the plurality of word lines.

9. The three-dimensional memory device of claim 7, wherein one portion of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of word lines, transmits the third neural network data and receives the second neural network data via the plurality of bit lines, and then transmits the fourth neural network data via the plurality of word lines; and another portion of the plurality of two-dimensional memory arrays receives the first neural network data via the plurality of bit lines, transmits the third neural network data and receives the second neural network data via the plurality of word lines, and then transmits the fourth neural network data via the plurality of bit lines.
10. A three-dimensional memory device, comprising: a plurality of word lines; a plurality of bit lines; a three-dimensional memory array comprising a plurality of two-dimensional memory arrays, wherein each of the two-dimensional memory arrays comprises a plurality of sub-arrays of identical size configured to store first neural network data, second neural network data, third neural network data and fourth neural network data associated with at least one neural network model, and the plurality of sub-arrays are coupled to the plurality of word lines and the plurality of bit lines, and are configured to receive a plurality of first input voltages and output a plurality of first output currents, and to receive a plurality of second input voltages and output a plurality of second output currents; a plurality of encoding circuits respectively coupled to the plurality of sub-arrays and configured to generate the plurality of first input voltages and the plurality of second input voltages according to the first neural network data and the second neural network data, respectively; and a plurality of sensing circuits respectively coupled to the plurality of sub-arrays and configured to generate the third neural network data and the fourth neural network data according to a sum of the plurality of first output currents and a sum of the plurality of second output currents, respectively.
11. The three-dimensional memory device of claim 10, wherein the first neural network data is associated with a K-th neural network layer of a first neural network model of the at least one neural network model, and the third neural network data is associated with a (K+1)-th neural network layer of the first neural network model; and the second neural network data is associated with an M-th neural network layer of a second neural network model of the at least one neural network model, and the fourth neural network data is associated with an (M+1)-th neural network layer of the second neural network model, wherein the first neural network model is different from the second neural network model, and M and K are positive integers.

12. The three-dimensional memory device of claim 11, wherein a plurality of first sub-arrays of the plurality of sub-arrays are each coupled to two of the plurality of sensing circuits, a plurality of second sub-arrays of the plurality of sub-arrays are each coupled to two of the plurality of encoding circuits, and the sensing circuits coupled to the first sub-arrays are coupled to the encoding circuits coupled to the second sub-arrays and are configured to: input the third neural network data of the first sub-arrays into the second sub-arrays to serve as the first neural network data of the second sub-arrays; and input the fourth neural network data of the first sub-arrays into the second sub-arrays to serve as the second neural network data of the second sub-arrays.
13. The three-dimensional memory device of claim 12, wherein each of the plurality of sub-arrays receives the first neural network data via the plurality of word lines and receives the second neural network data via the plurality of bit lines, and each of the plurality of sub-arrays transmits the third neural network data via the plurality of bit lines and transmits the fourth neural network data via the plurality of word lines.

14. The three-dimensional memory device of claim 12, wherein one portion of the plurality of sub-arrays receives the first neural network data via the plurality of word lines, receives the second neural network data via the plurality of bit lines, transmits the third neural network data via the plurality of bit lines, and transmits the fourth neural network data via the plurality of word lines; and another portion of the plurality of sub-arrays receives the first neural network data via the plurality of bit lines, receives the second neural network data via the plurality of word lines, transmits the third neural network data via the plurality of word lines, and transmits the fourth neural network data via the plurality of bit lines.
15. The three-dimensional memory device of claim 10, wherein the first neural network data is associated with a K-th neural network layer of a first neural network model of the at least one neural network model, the second neural network data is identical to the third neural network data and is associated with a (K+1)-th neural network layer of the first neural network model, and the fourth neural network data is associated with a (K+2)-th neural network layer of the first neural network model, wherein K is a positive integer.

16. The three-dimensional memory device of claim 15, wherein a plurality of first sub-arrays of the plurality of sub-arrays are coupled to a plurality of first sensing circuits and a plurality of second sensing circuits of the plurality of sensing circuits, and are coupled to a plurality of first encoding circuits and a plurality of second encoding circuits of the plurality of encoding circuits, wherein the plurality of first encoding circuits are configured to receive the first neural network data; the plurality of first sensing circuits are coupled to the plurality of second encoding circuits and configured to transmit the third neural network data to the plurality of first sub-arrays as the second neural network data; and the plurality of second sensing circuits are configured to transmit the fourth neural network data to a plurality of second sub-arrays of the plurality of sub-arrays to serve as the first neural network data of the second sub-arrays.
17. The three-dimensional memory device of claim 16, wherein each of the plurality of sub-arrays receives the first neural network data via the plurality of word lines and receives the second neural network data via the plurality of bit lines, and each of the plurality of sub-arrays transmits the third neural network data via the plurality of bit lines and transmits the fourth neural network data via the plurality of word lines.

18. The three-dimensional memory device of claim 16, wherein one portion of the plurality of sub-arrays receives the first neural network data via the plurality of word lines, transmits the third neural network data and receives the second neural network data via the plurality of bit lines, and then transmits the fourth neural network data via the plurality of word lines; and another portion of the plurality of sub-arrays receives the first neural network data via the plurality of bit lines, transmits the third neural network data and receives the second neural network data via the plurality of word lines, and then transmits the fourth neural network data via the plurality of bit lines.

19. The three-dimensional memory device of claim 10, wherein the first neural network data, the second neural network data, the third neural network data and the fourth neural network data are different from one another and are each associated with a neural network layer of one of the at least one neural network model.
20. The three-dimensional memory device of claim 19, wherein each of the plurality of sub-arrays receives the first neural network data via the plurality of word lines and receives the second neural network data via the plurality of bit lines, and each of the plurality of sub-arrays transmits the third neural network data via the plurality of bit lines and transmits the fourth neural network data via the plurality of word lines.
TW113105154A 2024-02-07 2024-02-07 Three-dimensional memory device TWI884704B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW113105154A TWI884704B (en) 2024-02-07 2024-02-07 Three-dimensional memory device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW113105154A TWI884704B (en) 2024-02-07 2024-02-07 Three-dimensional memory device

Publications (2)

Publication Number Publication Date
TWI884704B true TWI884704B (en) 2025-05-21
TW202533232A TW202533232A (en) 2025-08-16

Family

ID=96582150

Family Applications (1)

Application Number Title Priority Date Filing Date
TW113105154A TWI884704B (en) 2024-02-07 2024-02-07 Three-dimensional memory device

Country Status (1)

Country Link
TW (1) TWI884704B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10552510B2 (en) * 2018-01-11 2020-02-04 Mentium Technologies Inc. Vector-by-matrix multiplier modules based on non-volatile 2D and 3D memory arrays
TWI704569B (en) * 2019-10-29 2020-09-11 旺宏電子股份有限公司 Integrated circuit and computing method thereof
US10825510B2 (en) * 2019-02-09 2020-11-03 Purdue Research Foundation Multi-bit dot product engine
US11094376B2 (en) * 2019-06-06 2021-08-17 Stmicroelectronics International N.V. In-memory compute array with integrated bias elements
US20210326110A1 (en) * 2020-04-16 2021-10-21 Sandisk Technologies Llc Reconfigurable input precision in-memory computing
CN114388039A (en) * 2020-10-02 2022-04-22 桑迪士克科技有限责任公司 Multi-stage ultra-low power reasoning engine accelerator
US20220358345A1 (en) * 2021-05-10 2022-11-10 Samsung Electronics Co., Ltd. Device and method with multidimensional vector neural network
US11502696B2 (en) * 2018-10-15 2022-11-15 Intel Corporation In-memory analog neural cache
TWI787691B (en) * 2019-12-24 2022-12-21 財團法人工業技術研究院 Apparatus and method for neural network computation
TW202341150A (en) * 2022-04-05 2023-10-16 台灣積體電路製造股份有限公司 Memory system and operating method of memory array
US20230395143A1 (en) * 2018-06-29 2023-12-07 Taiwan Semiconductor Manufacturing Company, Ltd. Memory computation method

Also Published As

Publication number Publication date
TW202533232A (en) 2025-08-16

Similar Documents

Publication Publication Date Title
CN111338601B (en) Circuit for in-memory multiply and accumulate operation and method thereof
CN113571111B (en) Vertical mapping and computation of deep neural networks in non-volatile memory
US20200311512A1 (en) Realization of binary neural networks in nand memory arrays
US11068771B2 (en) Integrated neuro-processor comprising three-dimensional memory array
US11989646B2 (en) Neuromorphic apparatus having 3D stacked synaptic structure and memory device having the same
US20210110235A1 (en) Accelerating sparse matrix multiplication in storage class memory-based convolutional neural network inference
TW202022711A (en) Convolution accelerator using in-memory computation
KR102607860B1 (en) Neuromorphic apparatus having 3d stacked synaptic structure and memory apparatus having the same
KR20190121048A (en) Neuromorphic circuit having 3D stacked structure and Semiconductor device having the same
KR20220044643A (en) Ultralow power inference engine with external magnetic field programming assistance
US11556311B2 (en) Reconfigurable input precision in-memory computing
CN116483773B (en) An in-memory computing circuit and apparatus based on transposed DRAM cells
CN118072779B (en) Memory cell structure, control method thereof, array circuit and device, and electronic equipment
CN115019856A (en) Memory computing method and system based on RRAM multi-value storage
US20230361081A1 (en) In-memory computing circuit and fabrication method thereof
TWI884704B (en) Three-dimensional memory device
CN106251892B (en) High capacity memory
US12456525B2 (en) Three-dimensional memory device
CN218181836U (en) Memory operation device
CN114171087A (en) Memristor array structure, operation method thereof and neural network sparsification device
US20230315389A1 (en) Compute-in-memory cell
US12293804B2 (en) Convolution operation accelerator and convolution operation method
CN116997188A (en) Storage array
CN116935929A (en) Complementary storage circuits and memories
CN110390391B (en) A mapping device and method based on three-dimensional convolutional neural network