HK40001180A - Neurosynaptic unit circuit, neural network circuit and information processing system
Description
Technical Field
Embodiments of the present disclosure relate to a neurosynaptic unit circuit, a neural network circuit, an information processing system, and an information processing method.
Background
In the field of neuromorphic technology, the data processing capability and the machine learning capability of a computer can be greatly improved by constructing a computing architecture similar to the structure of the human brain. Compared with a serial software implementation based on a general-purpose Central Processing Unit (CPU), implementations that simulate biological neural activity with circuit devices and hardware systems allow a large amount of parallel computation and have the advantages of high operation speed and low power consumption, and have therefore become an important research direction in the field of neuromorphic technology.
The memristor has the characteristics of high integration density, low power consumption, continuous resistance change, non-volatility, compatibility with the CMOS (complementary metal oxide semiconductor) process, and the like, and is widely applied in neural network circuits. Crossbar arrays constructed from memristors are widely used in multilayer neural networks, adaptive resonance networks, and convolutional neural networks, and can implement weight adjustment of the neural networks.
Disclosure of Invention
At least one embodiment of the present disclosure provides a neurosynaptic unit circuit, including a first weight circuit and a second weight circuit, where the first weight circuit includes a first resistance change circuit and a first switch circuit, the first resistance change circuit is electrically connected to a first bit line terminal and the first switch circuit, and the first switch circuit is electrically connected to a first word line terminal, the first resistance change circuit and a source line terminal; the second weight circuit comprises a second resistance changing circuit and a second switch circuit, the second resistance changing circuit is electrically connected with the second bit line end and the second switch circuit, and the second switch circuit is electrically connected with the second word line end, the second resistance changing circuit and the source line end.
For example, in a neurosynaptic unit circuit provided by an embodiment of the present disclosure, the neurosynaptic unit circuit is configured such that, when a calculation operation is performed, a first bit line voltage applied to the first bit line terminal is greater than a source line voltage applied to the source line terminal, and a second bit line voltage applied to the second bit line terminal is less than the source line voltage.
For example, in a neurosynaptic unit circuit provided by an embodiment of the present disclosure, the first resistance change circuit includes a first memristor, the first switch circuit includes a first transistor, a first pole of the first memristor is connected to the first bit line terminal, a second pole of the first memristor is connected to a first pole of the first transistor, a gate of the first transistor is connected to the first word line terminal, and a second pole of the first transistor is connected to the source line terminal; the second resistance change circuit includes a second memristor, the second switch circuit includes a second transistor, a first pole of the second memristor is connected to the second bit line terminal, a second pole of the second memristor is connected to a first pole of the second transistor, a gate of the second transistor is connected to the second word line terminal, and a second pole of the second transistor is connected to the source line terminal.
For example, in a neurosynaptic unit circuit provided by an embodiment of the present disclosure, the first memristor and the second memristor comprise resistance graded devices.
At least one embodiment of the present disclosure further provides a neural network circuit, including a plurality of the neurosynaptic unit circuits according to any one of the embodiments of the present disclosure arranged in an array, the array including a plurality of rows and a plurality of columns.
For example, in a neural network circuit provided in an embodiment of the present disclosure, each row of the neurosynaptic unit circuits is correspondingly provided with a first bit line and a second bit line, the first bit line is electrically connected to the first resistance change circuit in the corresponding row of neurosynaptic unit circuits, and the second bit line is electrically connected to the second resistance change circuit in the corresponding row of neurosynaptic unit circuits; each column of the neurosynaptic unit circuits is correspondingly provided with a first word line, a second word line and a source line, the first word line is electrically connected with the first switch circuit in the corresponding column of neurosynaptic unit circuits, the second word line is electrically connected with the second switch circuit in the corresponding column of neurosynaptic unit circuits, and the source line is electrically connected with the source line terminal in the corresponding column of neurosynaptic unit circuits.
For example, in a neural network circuit provided in an embodiment of the present disclosure, each row of neurosynaptic unit circuits is provided with a first bit line, a second bit line, a first word line, and a second word line, where the first bit line is electrically connected to the first resistance change circuit in the corresponding row of neurosynaptic unit circuits, the second bit line is electrically connected to the second resistance change circuit in the corresponding row of neurosynaptic unit circuits, the first word line is electrically connected to the first switch circuit in the corresponding row of neurosynaptic unit circuits, and the second word line is electrically connected to the second switch circuit in the corresponding row of neurosynaptic unit circuits; and each column of the neurosynaptic unit circuits is correspondingly provided with a source line, and the source line is electrically connected with the source line terminal in the corresponding column of the neurosynaptic unit circuits.
For example, in a neural network circuit provided in an embodiment of the present disclosure, each row of the neurosynaptic unit circuits is correspondingly provided with a first bit line and a second bit line, the first bit line is electrically connected to the first resistance change circuit in the corresponding row of neurosynaptic unit circuits, and the second bit line is electrically connected to the second resistance change circuit in the corresponding row of neurosynaptic unit circuits; each column of the neurosynaptic unit circuits is correspondingly provided with a word line and a source line, the word line is electrically connected with the first switch circuit and the second switch circuit in the corresponding column of neurosynaptic unit circuits, and the source line is electrically connected with the source line terminal in the corresponding column of neurosynaptic unit circuits.
For example, in a neural network circuit provided in an embodiment of the present disclosure, each row of neurosynaptic unit circuits is provided with a first bit line, a second bit line, and a word line, the first bit line is electrically connected to the first resistance change circuit in the corresponding row of neurosynaptic unit circuits, the second bit line is electrically connected to the second resistance change circuit in the corresponding row of neurosynaptic unit circuits, and the word line is electrically connected to the first switch circuit and the second switch circuit in the corresponding row of neurosynaptic unit circuits; and each column of the neurosynaptic unit circuits is correspondingly provided with a source line, and the source line is electrically connected with the source line terminal in the corresponding column of the neurosynaptic unit circuits.
At least one embodiment of the present disclosure further provides an information processing system, including a neural network circuit according to any one of the embodiments of the present disclosure, a control circuit, a driving circuit, and an output circuit. The control circuit is configured to send a control signal to the driving circuit according to input data to be processed when a calculation operation is performed; the driving circuit is configured to provide a driving voltage to the neural network circuit in accordance with the control signal; and the output circuit is configured to process an output result of the neural network circuit.
For example, in an information processing system provided in an embodiment of the present disclosure, in a case where the neural network circuit includes m rows of neurosynaptic unit circuits, the data to be processed includes 1 × m of one-dimensional matrix data, and m is an integer greater than 1.
For example, in an information processing system provided by an embodiment of the present disclosure, the output circuit includes a sample-and-hold circuit and an analog-to-digital conversion circuit; the sample-and-hold circuit is configured to collect an analog current output by the neural network circuit, and the analog-to-digital conversion circuit is configured to convert the analog current to a digital current.
For example, in an information processing system provided by an embodiment of the present disclosure, the output circuit includes a sample-and-hold circuit and an analog-to-digital conversion circuit; the sample-and-hold circuit is configured to collect an analog current output by the neural network circuit and convert the analog current to an analog voltage, and the analog-to-digital conversion circuit is configured to convert the analog voltage to a digital voltage.
An embodiment of the present disclosure further provides an information processing method, which is used in the information processing system provided in the embodiment of the present disclosure, and the method includes: sending a control signal to the driving circuit according to the data to be processed; providing a driving voltage to the neural network circuit according to the control signal; and processing the output result of the neural network circuit.
For example, an information processing method provided in an embodiment of the present disclosure further includes: and initializing the neural network circuit according to a preset weight matrix.
For example, in an information processing method provided by an embodiment of the present disclosure, the initializing the neural network circuit according to a preset weight matrix includes: setting and/or resetting each neurosynaptic unit circuit in the neural network circuit; and detecting whether the conductance value of each neurosynaptic unit circuit in the neural network circuit is the same as the corresponding weight value in the preset weight matrix.
For example, an information processing method provided in an embodiment of the present disclosure further includes: if the conductance value of each neurosynaptic unit circuit in the neural network circuit is different from the corresponding weight value in the preset weight matrix, continuing the setting and/or resetting operation until the conductance value of each neurosynaptic unit circuit in the neural network circuit is the same as the corresponding weight value in the preset weight matrix.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings of the embodiments will be briefly introduced below, and it is apparent that the drawings in the following description relate only to some embodiments of the present disclosure and are not limiting to the present disclosure.
FIG. 1 is a schematic diagram of a neurosynaptic unit circuit;
FIG. 2 is a schematic diagram of a neural network circuit formed from the neurosynaptic unit circuit shown in FIG. 1;
FIG. 3 is a schematic diagram of a neurosynaptic unit circuit according to an embodiment of the present disclosure;
FIG. 4 is a circuit diagram of a neurosynaptic unit circuit according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a neural network circuit according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of another neural network circuit provided in an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of still another neural network circuit provided in an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of yet another neural network circuit provided in an embodiment of the present disclosure;
FIG. 9 is a schematic block diagram of an information processing system provided in an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of an information processing system according to an embodiment of the present disclosure;
FIG. 11 is a schematic diagram of another information processing system provided by an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of an information processing method according to an embodiment of the present disclosure;
FIG. 13 is a schematic diagram of another information processing method according to an embodiment of the present disclosure; and
FIG. 14 is a schematic structural diagram of a resistance graded device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings of the embodiments of the present disclosure. It is to be understood that the described embodiments are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the disclosure without any inventive step, are within the scope of protection of the disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. Also, the use of the terms "a," "an," or "the" and similar referents do not denote a limitation of quantity, but rather denote the presence of at least one. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
FIG. 1 illustrates a neurosynaptic unit circuit. As shown in FIG. 1, the circuit employs a 1T1R structure, i.e., it includes a transistor M1 (which may employ, for example, a metal-oxide-semiconductor (MOS) field-effect transistor, such as a CMOS field-effect transistor) and a memristor R1.
The gate of the transistor M1 is connected to the word line terminal WL; for example, a control voltage that turns the transistor M1 on or off may be received through the word line terminal WL. The source of the transistor M1 is connected to the source line terminal SL; for example, a reset voltage may be received through the source line terminal SL and the source line. The drain of the transistor M1 is connected to a second pole (e.g., a cathode) of the memristor R1, and a first pole (e.g., an anode) of the memristor R1 is connected to the bit line terminal BL; for example, a set voltage may be received through the bit line terminal BL and the bit line.
The word line applies a corresponding voltage to the gate of the transistor M1, thereby turning the transistor on or off. When the memristor R1 is operated, for example, in a set operation or a reset operation, the transistor M1 needs to be turned on first, that is, a turn-on voltage needs to be applied to the gate of the transistor M1 through the word line terminal WL. After the transistor M1 is turned on, a voltage may be applied across the memristor R1 through the source line terminal SL and the bit line terminal BL to change the resistance state of the memristor R1. For example, a set voltage may be applied through the bit line terminal BL to put the memristor R1 into a low resistance state; for another example, a reset voltage may be applied through the source line terminal SL to put the memristor R1 into a high resistance state.
It should be noted that, in the embodiment of the present disclosure, voltages are applied simultaneously through the word line terminal WL and the bit line terminal BL, so that the resistance value of the memristor R1 becomes smaller and smaller, that is, the memristor R1 changes from a high resistance state to a low resistance state, and an operation of changing the memristor R1 from the high resistance state to the low resistance state is referred to as a set operation; by applying voltages to the word line terminal WL and the source line terminal SL simultaneously, the resistance value of the memristor R1 becomes larger, namely the memristor R1 changes from a low-resistance state to a high-resistance state, and the operation of changing the memristor R1 from the low-resistance state to the high-resistance state is called a reset operation. The following embodiments are the same and will not be described again.
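The set/reset behavior described above can be summarized in a short behavioral sketch. The sketch below is only an illustration under assumed resistance values (the class name, resistance levels, and method names are not from the disclosure); it models a 1T1R cell whose memristor only changes state while the access transistor is on.

```python
# Minimal behavioral sketch of a 1T1R cell (illustrative only; values are assumptions).
class OneT1RCell:
    def __init__(self, r_high=100e3, r_low=10e3):
        self.r_high = r_high        # assumed high-resistance state, in ohms
        self.r_low = r_low          # assumed low-resistance state, in ohms
        self.resistance = r_high    # start in the high-resistance state

    def set_op(self, wl_on: bool):
        """Set: voltage applied via WL + BL drives the cell to the low-resistance state."""
        if wl_on:                   # nothing happens unless the access transistor conducts
            self.resistance = self.r_low

    def reset_op(self, wl_on: bool):
        """Reset: voltage applied via WL + SL drives the cell to the high-resistance state."""
        if wl_on:
            self.resistance = self.r_high

cell = OneT1RCell()
cell.set_op(wl_on=True)     # high-resistance state -> low-resistance state
cell.reset_op(wl_on=True)   # low-resistance state -> high-resistance state
```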
FIG. 2 illustrates a neural network circuit formed from a plurality of neurosynaptic unit circuits, such as the one shown in FIG. 1, arranged, for example, in an array of m rows and n columns. In FIG. 2, BL<1>, BL<2>, …, BL<m> denote the bit lines of the first row, the second row, …, the mth row, respectively, and the memristor in each row of neurosynaptic unit circuits is connected to the bit line of that row; WL<1>, WL<2>, …, WL<n> denote the word lines of the first column, the second column, …, the nth column, respectively, and the gate of the transistor in each column of neurosynaptic unit circuits is connected to the word line of that column; SL<1>, SL<2>, …, SL<n> denote the source lines of the first column, the second column, …, the nth column, respectively, and the source of the transistor in each column of neurosynaptic unit circuits is connected to the source line of that column.
The neural network circuit of m rows and n columns shown in FIG. 2 may represent a neural network weight matrix with m rows and n/2 columns. In a calculation operation, since two neurosynaptic unit circuits must be combined to represent one weight value of the neural network weight matrix, which may be positive or negative, the currents output on the source lines electrically connected to the neurosynaptic unit circuits in the odd columns and the even columns must be subtracted from each other. For example, assuming that the matrix data output by one such neural network circuit has n elements, 2n output circuits are required to detect the currents, and n adder (subtraction) circuits are required to perform the difference operations on the currents.
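For comparison with the circuits described below, the following sketch (array sizes, conductance ranges, and variable names are illustrative assumptions, not values from the disclosure) shows why the arrangement of FIG. 2 needs two sensed columns and one extra subtraction per output element.

```python
import numpy as np

m, n_out = 4, 3                                   # m input rows, n_out output elements
rng = np.random.default_rng(1)
g_pos = rng.uniform(1e-6, 1e-4, size=(m, n_out))  # conductances of the "positive" columns
g_neg = rng.uniform(1e-6, 1e-4, size=(m, n_out))  # conductances of the "negative" columns
v_in = np.array([0.2, 0.0, 0.2, 0.2])             # read voltages applied per row

i_pos = v_in @ g_pos      # n_out currents sensed from odd columns (n_out output circuits)
i_neg = v_in @ g_neg      # n_out currents sensed from even columns (n_out more output circuits)
outputs = i_pos - i_neg   # one extra difference operation per output element
print(outputs)
```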
At least one embodiment of the present disclosure provides a neurosynaptic unit circuit including a first weight circuit and a second weight circuit. The first weight circuit comprises a first resistance change circuit and a first switch circuit, the first resistance change circuit is electrically connected with a first bit line end and the first switch circuit, and the first switch circuit is electrically connected with a first word line end, the first resistance change circuit and a source line end; the second weight circuit comprises a second resistance changing circuit and a second switch circuit, the second resistance changing circuit is electrically connected with the second bit line end and the second switch circuit, and the second switch circuit is electrically connected with the second word line end, the second resistance changing circuit and the source line end. At least one embodiment of the disclosure further provides a neural network circuit, an information processing system and an information processing method corresponding to the neural synapse unit circuit.
The neurosynaptic unit circuit provided by the embodiments of the present disclosure can represent a weight value of a neural network weight matrix that may be either positive or negative, and a neural network circuit formed by such neurosynaptic unit circuits can be mapped to an actual neural network weight matrix more easily, so that circuit resource consumption can be reduced.
At least one embodiment of the present disclosure provides a neurosynaptic unit circuit 100, as shown in fig. 3, the neurosynaptic unit circuit 100 includes a first weight circuit 110 and a second weight circuit 120.
The first weight circuit 110 includes a first resistance change circuit 111 and a first switch circuit 112, the first resistance change circuit 111 is electrically connected to a first bit line terminal BL1 and the first switch circuit 112, and the first switch circuit 112 is electrically connected to a first word line terminal WL1, the first resistance change circuit 111, and a source line terminal SL. The second weight circuit 120 includes a second resistance change circuit 121 and a second switch circuit 122, the second resistance change circuit 121 is electrically connected to the second bit line terminal BL2 and the second switch circuit 122, and the second switch circuit 122 is electrically connected to the second word line terminal WL2, the second resistance change circuit 121, and the source line terminal SL.
For example, when the neurosynaptic unit circuit 100 shown in fig. 3 is used to represent a weight value, the first weight circuit 110 may represent a positive weight portion of the weight value, the second weight circuit 120 may represent a negative weight portion of the weight value, and the magnitudes of the positive weight and the negative weight may be adjusted by adjusting the first resistance change circuit 111 and the second resistance change circuit 121, respectively, so as to represent a positive weight value and a negative weight value of an arbitrary magnitude, for example, the weight value is a weight value in the weight matrix of the neural network.
For example, in performing an initialization operation, an on voltage may be applied to the first switch circuit 112 through the first word line terminal WL1 to turn on the first switch circuit 112, and then a voltage (e.g., a set voltage or a reset voltage) may be applied to the first resistance change circuit 111 through the first bit line terminal BL1 and the source line terminal SL to adjust a conductance value of the first resistance change circuit 111, so that the conductance value of the first resistance change circuit 111 may represent a positive weight portion of one weight value.
Similarly, an on voltage may be applied to the second switch circuit 122 through the second word line terminal WL2 to turn on the second switch circuit 122, and then a voltage (e.g., a set voltage or a reset voltage) may be applied to the second resistance change circuit 121 through the second bit line terminal BL2 and the source line terminal SL to adjust the conductance value of the second resistance change circuit 121, so that the conductance value of the second resistance change circuit 121 may represent a negative weight portion of one weight value.
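A minimal sketch of this decomposition is given below, assuming a linear programmable conductance window; the window limits, scale, and function name are illustrative assumptions rather than values from the disclosure.

```python
# Split a signed weight between the two resistance change circuits so that the
# difference of the two conductances is proportional to the weight (illustrative only).
G_MIN, G_MAX = 1e-6, 1e-4        # assumed programmable conductance window, in siemens

def weight_to_conductances(w, w_max=1.0):
    """Map a weight in [-w_max, w_max] to a (G1, G2) pair with G1 - G2 proportional to w."""
    scale = (G_MAX - G_MIN) / w_max
    g1 = G_MIN + scale * max(w, 0.0)    # first resistance change circuit: positive portion
    g2 = G_MIN + scale * max(-w, 0.0)   # second resistance change circuit: negative portion
    return g1, g2

g1, g2 = weight_to_conductances(-0.4)
print(g1 - g2)   # proportional to the signed weight -0.4
```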
For example, when performing a calculation operation, the first bit line voltage applied to the first bit line terminal BL1 may be made larger than the source line voltage applied to the source line terminal SL, and the second bit line voltage applied to the second bit line terminal BL2 may be made smaller than the source line voltage. With this configuration, the current flowing through the first weight circuit 110 is shown as I1 in the figure and the current flowing through the second weight circuit 120 is shown as I2 in the figure, so the current detected at the source line terminal SL satisfies Icell = I1 - I2. The first weight circuit 110 may thus represent a positive weight portion of a weight value, and the second weight circuit 120 may represent a negative weight portion of the weight value.
It should be noted that the embodiments of the present disclosure are not limited to the above configuration, for example, when performing a calculation operation, the first bit line voltage applied to the first bit line terminal BL1 may be smaller than the source line voltage applied to the source line terminal SL, and the second bit line voltage applied to the second bit line terminal BL2 may be larger than the source line voltage. In this case, the first weight circuit 110 may represent a negative weight portion of a weight value, and the second weight circuit 120 may represent a positive weight portion of a weight value.
The neurosynaptic unit circuit 100 shown in FIG. 3 may represent one weight value of a neural network weight matrix, which may be positive or negative. When a plurality of the neurosynaptic unit circuits 100 shown in FIG. 3 form a neural network circuit, the neural network circuit is more easily mapped to an actual neural network weight matrix, so that the corresponding calculation operation can be completed; in addition, the neural network circuit does not need additional adder circuits, so that circuit resource consumption can be reduced.
In one example, the neurosynaptic unit circuit 100 shown in FIG. 3 may be implemented as the circuit structure in FIG. 4.
For example, as shown in fig. 4, the first resistive switching circuit 111 may be implemented as a first memristor R1, the first switching circuit 112 may be implemented as a first transistor M1, a first pole (e.g., a positive pole) of the first memristor R1 is connected to the first bit line terminal BL1, a second pole (e.g., a negative pole) of the first memristor R1 is connected to the first pole of the first transistor M1, a gate of the first transistor M1 is connected to the first word line terminal WL1, and a second pole of the first transistor M1 is connected to the source line terminal SL. The second resistive switching circuit 121 may be implemented as a second memristor R2, the second switching circuit 122 may be implemented as a second transistor M2, a first pole of the second memristor R2 is connected to the second bit line terminal BL2, a second pole of the second memristor R2 is connected to the first pole of the second transistor M2, a gate of the second transistor M2 is connected to the second word line terminal WL2, and the second pole of the second transistor M2 is connected to the source line terminal SL.
For example, as shown in fig. 4, in an initialization operation, a conducting voltage may be applied to the first transistor M1 through the first word line terminal WL1 to turn on the first transistor M1, and then a voltage (e.g., a set voltage or a reset voltage) may be applied to the two poles of the first memristor R1 through the first bit line terminal BL1 and the source line terminal SL to adjust the conductance value of the first memristor R1, so that the conductance value of the first memristor R1 may represent a positive weight portion of a weight value.
Similarly, a conducting voltage may be applied to the second transistor M2 through the second word line terminal WL2 to turn on the second transistor M2, and then a voltage (e.g., a set voltage or a reset voltage) may be applied to the two poles of the second memristor R2 through the second bit line terminal BL2 and the source line terminal SL to adjust the conductance value of the second memristor R2, so that the conductance value of the second memristor R2 may represent a negative weight portion of one weight value.
For example, when performing a calculation operation, a turn-on voltage may be simultaneously applied to the gates of the first transistor M1 and the second transistor M2 through the first word line terminal WL1 and the second word line terminal WL2 to turn on the first transistor M1 and the second transistor M2, while the first bit line voltage applied to the first bit line terminal BL1 is made greater than the source line voltage applied to the source line terminal SL and the second bit line voltage applied to the second bit line terminal BL2 is made less than the source line voltage. With this configuration, the current flowing through the first memristor R1 is shown as I1 in the figure and the current flowing through the second memristor R2 is shown as I2 in the figure, so the current detected at the source line terminal SL satisfies Icell = I1 - I2. The conductance value of the first memristor R1 may thus represent a positive weight portion of a weight value, and the conductance value of the second memristor R2 may represent a negative weight portion of the weight value.
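Assuming ideal ohmic memristors, the read-out relation above can be checked with the short sketch below (the function name and numeric values are illustrative assumptions).

```python
def cell_current(g1, g2, vref=2.5, vread=0.2):
    """Net current at the source line terminal for BL1 = Vref + Vread, BL2 = Vref - Vread, SL = Vref."""
    i1 = ((vref + vread) - vref) * g1   # current through the first memristor R1
    i2 = (vref - (vref - vread)) * g2   # magnitude of the current through the second memristor R2
    return i1 - i2                      # Icell = I1 - I2 = Vread * (G1 - G2)

print(cell_current(g1=8e-5, g2=2e-5))   # 0.2 * (8e-5 - 2e-5) = 1.2e-5 A
```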
For example, the first memristor R1 and the second memristor R2 shown in fig. 4 are resistance-graded devices, so that when an initialization operation is performed, the conductance values of the first memristor R1 and the second memristor R2 can be adjusted gradually, so that the first memristor R1 and the second memristor R2 can reach corresponding conductance values more easily, and thus can represent a positive weight part and a negative weight part of a weight value respectively.
FIG. 14 is a schematic structural diagram of a resistance graded device according to an embodiment of the disclosure. For example, as shown in FIG. 14, the resistance graded device includes a first oxide layer 12 and a second oxide layer 13 which are laminated, and the oxygen content of the first oxide layer 12 is higher than that of the second oxide layer 13. Here, the oxygen content refers to the molar percentage of oxygen in the oxide.
For example, the materials of the first oxide layer 12 and the second oxide layer 13 are both metal oxides. For example, the material of the first oxide layer 12 may be tantalum pentoxide (Ta2O5), aluminum oxide (Al2O3), or the like, and the material of the second oxide layer 13 may be tantalum dioxide (TaO2) or the like.
For example, the resistance graded device further includes a first electrode layer 11 and a second electrode layer 14, the first oxide layer 12 and the second oxide layer 13 are disposed between the first electrode layer 11 and the second electrode layer 14, the first electrode layer 11 is electrically connected to the first oxide layer 12, and the second electrode layer 14 is electrically connected to the second oxide layer 13.
For example, the material of the first electrode layer 11 is an active metal material, so that the resistance value of the resistance graded device is slowly changed under an applied voltage. The active metal may be, for example, aluminum (Al), nickel (Ni), titanium (Ti), or the like. The material of the second electrode layer 14 may be a conductive material such as a metal, and may be, for example, copper (Cu), aluminum (Al), tungsten (W), or the like.
In the neurosynaptic unit circuit 100 shown in FIG. 4, by adjusting the conductance values of the first memristor R1 and the second memristor R2 so that they represent the positive weight portion and the negative weight portion of one weight value, respectively, the neurosynaptic unit circuit 100 may represent one weight value of the neural network weight matrix, which may be positive or negative.
It should be noted that the transistors used in the embodiments of the present disclosure may be thin film transistors or field effect transistors (e.g., CMOS field effect transistors) or other switching devices with the same characteristics. The source and drain of the transistor used herein may be symmetrical in structure, so that there may be no difference in structure between the source and drain. In the embodiments of the present disclosure, in order to distinguish two poles of a transistor except for a gate, one of them is directly described as a first pole, and the other is a second pole.
The embodiments of the present disclosure do not limit the type of the transistors used. For example, when a transistor is an N-type transistor, its gate is connected to a word line, and the transistor is turned on when a high level is input on the word line; when a transistor is a P-type transistor, its gate is connected to a word line, and the transistor is turned on when a low level is input on the word line. The embodiments of the present disclosure are illustrated by taking N-type transistors as examples; the present disclosure includes but is not limited to this, and for example, one or more transistors in the embodiments of the present disclosure may also be P-type transistors.
At least one embodiment of the present disclosure also provides a neural network circuit 10, the neural network circuit 10 includes a plurality of neurosynaptic unit circuits 100 arranged in an array, for example, the neurosynaptic unit circuit 100 may adopt the circuit structure shown in fig. 4.
For example, in the example shown in FIG. 5, the neural network circuit 10 includes m rows and n columns of neurosynaptic unit circuits 100. In FIG. 5, BLp<1>, BLp<2>, …, BLp<m> denote the first bit lines of the first row, the second row, …, the mth row, respectively; BLn<1>, BLn<2>, …, BLn<m> denote the second bit lines of the first row, the second row, …, the mth row, respectively; WLp<1>, WLp<2>, …, WLp<n> denote the first word lines of the first column, the second column, …, the nth column, respectively; WLn<1>, WLn<2>, …, WLn<n> denote the second word lines of the first column, the second column, …, the nth column, respectively; and SL<1>, SL<2>, …, SL<n> denote the source lines of the first column, the second column, …, the nth column, respectively.
As shown in fig. 5, a first bit line and a second bit line are correspondingly disposed in each row of neurosynaptic unit circuits 100, and the first bit line is electrically connected to the first resistive switching circuit 111 in the corresponding row of neurosynaptic unit circuits 100 (i.e., connected to the first pole of the first memristor R1), so that the first bit line voltage can be provided to the corresponding row of neurosynaptic unit circuits 100; the second bit line is electrically connected with the second resistive switching circuit 121 in the corresponding row of neurosynaptic cell circuits 100 (i.e., connected with the first pole of the second memristor R2), such that the second bit line voltage may be provided to the corresponding row of neurosynaptic cell circuits 100.
A first word line, a second word line and a source line are correspondingly arranged in each column of the neurosynaptic unit circuit 100, the first word line is electrically connected with the first switch circuit 112 in the corresponding column of the neurosynaptic unit circuit 100 (i.e. connected with the gate of the first transistor M1), so that the voltage for making the first transistor M1 conductive can be provided for the corresponding column of the neurosynaptic unit circuit 100; the second word line is electrically connected to the second switch circuit 122 in the corresponding column of neurosynaptic cell circuits 100 (i.e., to the gate of the second transistor M2), such that a voltage may be provided to the corresponding column of neurosynaptic cell circuits 100 that causes the second transistor M2 to turn on; the source line is electrically connected to a source line terminal in the corresponding column of neurosynaptic cell circuits 100, so that a source line voltage can be supplied to the corresponding column of neurosynaptic cell circuits 100.
The neural network circuit 10 shown in fig. 5 can be used to represent the actual neural network weight matrix, thereby performing the corresponding calculation operation. For example, when the number n of columns of a neural network weight matrix is much smaller than the number m of rows (e.g., n is 32, m is 128), the number of driving circuits for driving the first word line and the second word line can be saved by using the neural network circuit 10 shown in fig. 5, and thus, the circuit resource consumption can be saved.
For another example, as shown in FIG. 6, the neural network circuit 10 also includes m rows and n columns of neurosynaptic unit circuits 100. In FIG. 6, BLp<1>, BLp<2>, …, BLp<m> denote the first bit lines of the first row, the second row, …, the mth row, respectively; BLn<1>, BLn<2>, …, BLn<m> denote the second bit lines of the first row, the second row, …, the mth row, respectively; WLp<1>, WLp<2>, …, WLp<m> denote the first word lines of the first row, the second row, …, the mth row, respectively; WLn<1>, WLn<2>, …, WLn<m> denote the second word lines of the first row, the second row, …, the mth row, respectively; and SL<1>, SL<2>, …, SL<n> denote the source lines of the first column, the second column, …, the nth column, respectively.
As shown in fig. 6, a first bit line, a second bit line, a first word line and a second word line are correspondingly disposed in each row of neurosynaptic unit circuits 100, and the first bit line is electrically connected to the first resistive switching circuit 111 in the corresponding row of neurosynaptic unit circuits 100 (i.e., connected to the first pole of the first memristor R1), so that the first bit line voltage can be provided to the corresponding row of neurosynaptic unit circuits 100; the second bit line is electrically connected with the second resistive switching circuit 121 in the corresponding row of neurosynaptic cell circuits 100 (i.e., connected with the first pole of the second memristor R2), such that the second bit line voltage may be provided to the corresponding row of neurosynaptic cell circuits 100.
The first word line is electrically connected to the first switch circuit 112 in the corresponding row of neurosynaptic cell circuits 100 (i.e., connected to the gate of the first transistor M1), so that a voltage that turns on the first transistor M1 can be provided to the corresponding row of neurosynaptic cell circuits 100; the second word line is electrically connected to the second switch circuit 122 in the corresponding row of neurosynaptic cell circuits 100 (i.e., to the gate of the second transistor M2), such that a voltage may be provided to the corresponding row of neurosynaptic cell circuits 100 that causes the second transistor M2 to turn on.
Each column of the neurosynaptic unit circuits 100 is provided with a source line, and the source line is electrically connected to the source line terminal in the corresponding column of the neurosynaptic unit circuits 100, so that the source line voltage can be provided to the corresponding column of the neurosynaptic unit circuits 100.
The neural network circuit 10 shown in fig. 6 can be used to represent the actual neural network weight matrix, thereby performing the corresponding calculation operation. For example, when the number m of rows of a neural network weight matrix is much smaller than the number n of columns (e.g., n is 128, m is 32), the number of driving circuits for driving the first word line and the second word line can be saved by using the neural network circuit 10 shown in fig. 6, so that the circuit resource consumption can be saved.
For another example, as shown in FIG. 7, the neural network circuit 10 also includes m rows and n columns of neurosynaptic unit circuits 100. The neural network circuit 10 shown in FIG. 7 differs from the neural network circuit 10 shown in FIG. 5 in that: in FIG. 7, only one word line is disposed for each column of neurosynaptic unit circuits 100, and the word line is electrically connected to the first switch circuit 112 and the second switch circuit 122 in that column of neurosynaptic unit circuits 100 (i.e., connected to the gates of the first transistor M1 and the second transistor M2), so that the turn-on voltage can be supplied to the first transistor M1 and the second transistor M2. In the neural network circuit 10 shown in FIG. 7, only one word line is correspondingly arranged for each column of neurosynaptic unit circuits 100, so that wiring resources can be saved.
For another example, as shown in FIG. 8, the neural network circuit 10 also includes m rows and n columns of neurosynaptic unit circuits 100. The neural network circuit 10 shown in FIG. 8 differs from the neural network circuit 10 shown in FIG. 6 in that: in FIG. 8, only one word line is disposed for each row of neurosynaptic unit circuits 100, and the word line is electrically connected to the first switch circuit 112 and the second switch circuit 122 in that row of neurosynaptic unit circuits 100 (i.e., connected to the gates of the first transistor M1 and the second transistor M2), so that the turn-on voltage can be supplied to the first transistor M1 and the second transistor M2. In the neural network circuit 10 shown in FIG. 8, only one word line is correspondingly arranged for each row of neurosynaptic unit circuits 100, so that wiring resources can be saved.
The neural network circuit 10 provided by the embodiment of the present disclosure is more easily mapped to an actual neural network weight matrix, thereby completing a corresponding calculation operation; in addition, the neural network circuit 10 does not need to be additionally provided with an adder circuit, so that the consumption of circuit resources can be saved.
At least one embodiment of the present disclosure also provides an information processing system 1, as shown in fig. 9, the information processing system 1 including a neural network circuit 10, a control circuit 20, a drive circuit 30, and an output circuit 40. For example, the neural network circuit 10 may employ any of the neural network circuits 10 provided by the embodiments of the present disclosure.
For example, the control circuit 20 is configured to transmit a control signal to the drive circuit 30 in accordance with input data to be processed when performing a calculation operation.
For example, when the neural network circuit 10 performs a calculation operation, the data to be processed is generally a digital voltage represented as bit data, the control circuit 20 may transmit a control signal to the driving circuit 30 according to the data to be processed, and then the driving circuit 30 supplies the driving voltage to the neural network circuit 10 according to the received control signal. For example, the driving voltage supplied to the neural network circuit 10 by the driving circuit 30 is an analog voltage. For example, in the case where the neural network circuit 10 includes m rows of neurosynaptic unit circuits 100, the data to be processed may be 1 × m of one-dimensional matrix data, m being an integer greater than 1. It should be noted that, the relationship between the data to be processed and the analog voltage provided to the neural network circuit 10 will be described below, and will not be described herein again.
In addition, when the neural network circuit 10 performs the initialization operation, the conductance values of the neurosynaptic unit circuits 100 in the neural network circuit 10 need to be initialized, and the control circuit 20 needs to control the driving circuit 30 to provide the corresponding driving voltages, so that the neurosynaptic unit circuits 100 complete the set operation and/or the reset operation. For example, in some embodiments, the control circuit 20 may be implemented as a controller.
It should be noted that, in the embodiment of the present disclosure, the control circuit 20 may also be implemented as a processor. For example, the processor may include various computing architectures such as a Complex Instruction Set Computer (CISC) architecture, a Reduced Instruction Set Computer (RISC) architecture, or an architecture that implements a combination of instruction sets. In some embodiments, the processor may also be a microprocessor, such as an X86 processor or an ARM processor, or may be a Digital Signal Processor (DSP), or the like.
For example, the drive circuit 30 is configured to provide a drive voltage to the neural network circuit 10. For example, in the example of fig. 10 and 11, the drive circuit 30 includes a bit line drive circuit 301, a word line drive circuit 302, and a source line drive circuit 303. For example, bit line driver circuit 301 is connected to first and second bit lines in neural network circuit 10 to provide respective first and second bit line voltages; the word line driving circuit 302 is connected with a first word line and a second word line in the neural network circuit 10 to provide corresponding control voltages for turning on or off the transistors; the source line driver circuit 303 is connected to source lines in the neural network circuit 10 to supply corresponding source line voltages. For example, in some examples, the bit line driver circuit 301, the word line driver circuit 302, and the source line driver circuit 303 may all be implemented as drivers.
For example, the output circuit 40 is configured to process the output result of the neural network circuit 10. For example, as in the examples of fig. 10 and 11, the output circuit 40 includes a sample-and-hold circuit S & H and an analog-to-digital conversion circuit ADC. For example, in one example, the sample-and-hold circuit S & H is configured to collect an analog current output by the neural network circuit 10, and the analog-to-digital conversion circuit ADC is configured to convert the analog current to a digital current. For another example, in another example, the sample-and-hold circuit S & H is configured to collect an analog current output by the neural network circuit 10 and convert the analog current to an analog voltage, and the analog-to-digital conversion circuit ADC is configured to convert the analog voltage to a digital voltage.
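As an illustration of the second output-path variant, the sketch below converts a column current to a voltage with a sampling resistor and then quantizes it; the resistor value, full-scale range, and resolution are assumptions, not values from the disclosure.

```python
def sample_and_hold(i_column, r_sample=10e3):
    """Convert an analog column current to an analog voltage via an assumed sampling resistor."""
    return i_column * r_sample

def adc(v_analog, v_full_scale=1.0, bits=8):
    """Quantize an analog voltage to an unsigned digital code (negative inputs clamp to 0 here)."""
    v_clamped = max(0.0, min(v_analog, v_full_scale))
    return round(v_clamped / v_full_scale * (2 ** bits - 1))

print(adc(sample_and_hold(1.2e-5)))   # 1.2e-5 A * 10 kOhm = 0.12 V -> code 31
```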
The information processing system 1 provided by the embodiments of the present disclosure can implement parallel computation, thereby increasing the computation speed and reducing the power consumption. The information processing system 1 can be applied to a multilayer neural network, an adaptive resonance network, or a convolutional neural network, with the weight adjustment performed by the neural network circuit 10, so as to complete the corresponding calculation operations.
Fig. 10 and 11 are two examples of the information processing system 1 shown in fig. 9, in which the neural network circuit 10 in the information processing system 1 shown in fig. 10 corresponds to the neural network circuit 10 shown in fig. 5, and the neural network circuit 10 in the information processing system 1 shown in fig. 11 corresponds to the neural network circuit 10 shown in fig. 6. The following describes the operation principle of the information processing system 1 and the neural network circuit 10 provided by the embodiment of the present disclosure, taking the information processing system 1 shown in fig. 10 as an example. Note that the transistors in fig. 10 are all described using N-type transistors as an example.
For example, as shown in FIG. 10, the neural network circuit 10 includes m rows and n columns of neurosynaptic unit circuits 100. Assuming that the neural network circuit 10 has already undergone an initialization operation, the neural network circuit 10 may represent a weight matrix W(m×n). For example, the data to be processed is 1 × m one-dimensional matrix data I(1×m); the one-dimensional matrix data I(1×m) is multiplied by the weight matrix W(m×n) to obtain 1 × n one-dimensional matrix data O(1×n), i.e., O(1×n) = I(1×m) × W(m×n).
When performing a calculation operation, the control circuit 20 sends a control signal to the driving circuit 30 according to the data I(1×m) to be processed, so as to control the driving circuit 30 to provide corresponding analog voltages to the neural network circuit 10. For example, in one example, each element of the data I(1×m) to be processed is 1-bit data, so that it can represent an analog voltage with two states: bit data [0] indicates that no voltage is applied, and bit data [1] indicates that a voltage is applied. When the bit data is [1], the first bit line voltage to be supplied to the first bit line is Vref + Vread, and the second bit line voltage to be supplied to the second bit line is Vref - Vread, where Vref denotes the source line voltage supplied to the source line and Vread denotes the read voltage. For example, as shown in Table 1, in one example, when the source line voltage Vref is 2.5 V and the read voltage Vread is 0.2 V, the first bit line voltage to be provided is 2.7 V and the second bit line voltage to be provided is 2.3 V.
TABLE 1
| Voltage | Set | Reset | Calculation |
| --- | --- | --- | --- |
| First word line voltage | 3.5 V | 5 V | 5 V |
| Second word line voltage | GND | GND | 5 V |
| First bit line voltage | 2.5 V | GND | 2.7 V (Vref + Vread) |
| Second bit line voltage | GND | 2.5 V | 2.3 V (Vref - Vread) |
| Source line voltage | GND | 2.5 V | 2.5 V (Vref) |
As another example, in another example, each element of the data I(1×m) to be processed is 2-bit data, so that it can represent an analog voltage with four states: bit data [00] indicates that no voltage is applied, bit data [01] corresponds to a read voltage Vread of 0.1 V, bit data [10] corresponds to a read voltage Vread of 0.2 V, and bit data [11] corresponds to a read voltage Vread of 0.3 V; the required first bit line voltage and second bit line voltage can then be obtained by combining the read voltage with the source line voltage Vref.
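The input encoding of Table 1 and of the 2-bit example above can be summarized with the sketch below; the step of 0.1 V per input level and the treatment of 1-bit inputs as levels 0 and 2 are assumptions made only to cover both examples in one function.

```python
VREF = 2.5     # source line voltage Vref, in volts (Table 1)
VSTEP = 0.1    # assumed read-voltage increment per input level, in volts

def element_to_voltages(level: int):
    """Map an input level (0/2 for 1-bit data, 0..3 for 2-bit data) to (BL1, BL2, SL) voltages."""
    if level == 0:
        return None                      # bit data [0] / [00]: no voltage applied
    vread = level * VSTEP
    return VREF + vread, VREF - vread, VREF

print(element_to_voltages(2))   # 1-bit "1" with Vread = 0.2 V -> (2.7, 2.3, 2.5)
```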
For example, in the example shown in Table 1, each element of the data I(1×m) to be processed is 1-bit data. When performing a calculation operation, if, for example, the element of the data I(1×m) to be processed that corresponds to the first row is 1, a voltage needs to be applied to the neurosynaptic unit circuits 100 of the first row. Specifically, the control circuit 20 controls the bit line driving circuit 301 to provide a first bit line voltage of 2.7 V to the first bit line BLp<1> of the first row and a second bit line voltage of 2.3 V to the second bit line BLn<1> of the first row; the control circuit 20 controls the word line driving circuit 302 to supply a turn-on voltage of 5 V to the first word lines and the second word lines of all columns; and the control circuit 20 controls the source line driving circuit 303 to supply a source line voltage of 2.5 V to the source lines of all columns.
The above description takes one element of the data I(1×m) to be processed as an example, that is, the control circuit 20 controls the driving circuit 30 to provide corresponding analog voltages to one row of neurosynaptic unit circuits 100 in the neural network circuit 10 according to that element; the analog voltages required by the neurosynaptic unit circuits 100 in the other rows may be handled in the same way, which is not described again here. It should be noted that each element of the data I(1×m) to be processed may also have more bits, which is not limited by the embodiments of the present disclosure.
After the analog voltage applied to the neural network circuit 10 is calculated by the neural network circuit 10, an output result can be obtained from the source line. For example, the output circuit 40 may process the output result, and as described above, in one example, the sample-and-hold circuit S & H may be configured to capture an analog current output by the neural network circuit 10, e.g., the sample-and-hold circuit S & H is connected to a source line, thereby capturing an analog current on the source line. The analog-to-digital conversion circuit ADC is connected to the sample-and-hold circuit S & H, and can convert the analog current output by the sample-and-hold circuit S & H into a digital current. For another example, in another example, the sample-and-hold circuit may be configured to collect an analog current output by the neural network circuit 10 and convert the analog current to an analog voltage, e.g., the sample-and-hold circuit may include an integrator or a sampling resistor; accordingly, the analog-to-digital conversion circuit ADC is configured to convert the analog voltage into a digital voltage.
After processing by the output circuit 40, the result of multiplying the one-dimensional matrix data I(1×m) by the weight matrix W(m×n), i.e., the one-dimensional matrix data O(1×n), can be obtained. For example, the output result Data<1> of the output circuit 40 corresponding to the first column of neurosynaptic unit circuits 100 is the first element of O(1×n), and the output result Data<n> of the output circuit 40 corresponding to the nth column of neurosynaptic unit circuits 100 is the nth element of O(1×n).
In the above calculation operation, for each of the neurosynaptic unit circuits 100, the voltage applied to the first pole of the first memristor R1 is the first bit line voltage Vref + Vread, the voltage applied to the first pole of the second memristor R2 is the second bit line voltage Vref - Vread, and the voltage applied to the source line terminal (i.e., the second pole of the first memristor R1 and the second pole of the second memristor R2) is the source line voltage Vref. The voltage differences applied across the first memristor R1 and the second memristor R2 are therefore Vread and -Vread, respectively, and the current detected through the source line is Icell = I1 - I2 = Vread × (G1 - G2), where I1 represents the current flowing through the first memristor R1, I2 represents the current flowing through the second memristor R2, G1 represents the conductance value of the first memristor R1, and G2 represents the conductance value of the second memristor R2.
As described above, when the neural network circuit 10 represents a neural network weight matrix, each of the neurosynaptic unit circuits 100 may represent one weight value in the weight matrix, which may be positive or negative: the conductance value G1 of the first memristor R1 in the neurosynaptic unit circuit 100 may represent the positive weight portion of the weight value, and the conductance value G2 of the second memristor R2 in the neurosynaptic unit circuit 100 may represent the negative weight portion of the weight value.
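The array-level computation can therefore be viewed as a matrix-vector product, as the sketch below illustrates (array sizes, conductance ranges, and read voltages are illustrative assumptions): summing Vread × (G1 - G2) over the rows of each column reproduces the product of the input vector with the signed weight matrix.

```python
import numpy as np

m, n = 4, 3
rng = np.random.default_rng(0)
g1 = rng.uniform(1e-6, 1e-4, size=(m, n))   # conductances of the first memristors
g2 = rng.uniform(1e-6, 1e-4, size=(m, n))   # conductances of the second memristors
vread = np.array([0.2, 0.0, 0.2, 0.2])      # per-row read voltages (0 V where the input bit is 0)

# Per-column source line current, summed cell by cell.
i_loop = np.array([sum(vread[i] * (g1[i, j] - g2[i, j]) for i in range(m)) for j in range(n)])
# The same result as a single matrix-vector product with the signed weight matrix G1 - G2.
i_matrix = vread @ (g1 - g2)
assert np.allclose(i_loop, i_matrix)
print(i_matrix)   # proportional to the output row O = I x W
```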
Before the calculation operation, the neural network circuit 10 needs to be initialized according to the preset weight matrix, so that the neural network circuit 10 can represent the preset weight matrix. The initialization operation of the neural network circuit 10 is explained below with reference to fig. 10 and table 1.
For example, initializing the neural network circuit 10 according to the preset weight matrix includes: setting and/or resetting each neurosynaptic cell circuit 100 in the neural network circuit 10; and detecting whether the conductance value of each neurosynaptic unit circuit 100 in the neural network circuit 10 is the same as the corresponding weight value in the preset weight matrix.
For example, for the neurosynaptic cell circuit 100 of the first row and the first column shown in fig. 10, for example, the first memristor R1 in the neurosynaptic cell circuit 100 may be initialized first. In the set operation, a first word line voltage is applied through the first word line WLp <1> such that the first transistor M1 is turned on, for example, the first word line voltage may be 3.5V. Meanwhile, a set voltage (first bit line voltage) is applied through the first bit line BLp <1> to set the first memristor R1, which may be 2.5V, for example. The conductance value of the first memristor R1 may be increased by the set operation.
In the reset operation, a first word line voltage is applied through the first word line WLp <1> so that the first transistor M1 is turned on; for example, the first word line voltage may be 5V. Meanwhile, a reset voltage (the source line voltage), which may be 2.5V for example, is applied through the source line SL <1> to reset the first memristor R1. The reset operation reduces the conductance value of the first memristor R1.
The following points are noted for the voltage example shown in table 1. First, "GND" in table 1 represents ground, i.e., the corresponding voltage value is 0V. Second, the first word line voltage is 3.5V in the set operation and 5V in the reset operation. As described above, the set operation increases the conductance value of the first memristor R1, i.e., decreases its resistance value; with the voltage across the first memristor R1 unchanged, the current flowing through it grows as its resistance falls, so to prevent an excessive current from over-programming the device, the first word line voltage used in the set operation is made smaller than that used in the reset operation. Third, in the reset operation, the second word line voltage is zero, so the second transistor M2 is turned off; the second bit line voltage (e.g., 2.5V, the same as the source line voltage) is nevertheless applied through the second bit line BLn <1>, which further prevents leakage through the second transistor M2. Of course, the second bit line voltage may also be set to zero at this time; the embodiments of the present disclosure are not limited in this respect.
For example, after the initialization operation for the first memristor R1 in the neurosynaptic unit circuit 100 of the first row and the first column is completed, the initialization operation for the second memristor R2 in that neurosynaptic unit circuit 100 may be performed in the same manner. After the initialization operation on the neurosynaptic unit circuit 100 of the first row and the first column is completed, the initialization operation may be completed in turn for the other neurosynaptic unit circuits 100 in the neural network circuit 10 in the same way, which is not repeated here.
In the embodiments of the present disclosure, when the set operation or the reset operation is performed, the first bit line voltage, the second bit line voltage, and the source line voltage applied by the driving circuit 30 may be pulse voltages; the pulse width of the pulse voltage may range from 20 ns to 100 ns, for example 50 ns.
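The programming conditions described above can be collected in a small sketch; the voltage values stated in the text are reproduced, while entries marked as assumed (and the helper function itself) are illustrative and follow the common convention that inactive lines sit at GND.

```python
# Programming conditions for the first memristor R1 of the neurosynaptic unit
# circuit in the first row and first column, as described above. Values not
# stated explicitly in the text are marked "assumed".

PROGRAM_CONDITIONS = {
    "set": {
        "WLp<1>": 3.5,   # first word line voltage for the set operation
        "BLp<1>": 2.5,   # set voltage on the first bit line
        "SL<1>":  0.0,   # assumed: source line at GND during set
        "WLn<1>": 0.0,   # assumed: second transistor kept off
        "BLn<1>": 0.0,   # assumed: second bit line at GND
    },
    "reset": {
        "WLp<1>": 5.0,   # first word line voltage for the reset operation
        "BLp<1>": 0.0,   # assumed: first bit line at GND during reset
        "SL<1>":  2.5,   # reset voltage on the source line
        "WLn<1>": 0.0,   # second word line at zero, second transistor off
        "BLn<1>": 2.5,   # same as the source line, to prevent leakage
    },
}

def program_pulse(operation, pulse_width_ns=50):
    """Return the node voltages and pulse width for one set or reset pulse.
    The pulse width may range from about 20 ns to 100 ns (50 ns in the example)."""
    if not 20 <= pulse_width_ns <= 100:
        raise ValueError("pulse width outside the 20-100 ns example range")
    return PROGRAM_CONDITIONS[operation], pulse_width_ns
```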
For example, after each set operation or reset operation, whether the conductance value of the neurosynaptic unit circuit 100 is the same as the corresponding weight value in the preset weight matrix may be detected. The conductance value (G1 − G2) of the neurosynaptic unit circuit 100 can be calculated from the first bit line voltage applied to the first bit line, the second bit line voltage applied to the second bit line, and the current detected on the source line.
If the detected conductance value of the neurosynaptic unit circuit 100 is the same as the corresponding weight value in the preset weight matrix, the initialization operation ends; if the conductance value of the neurosynaptic unit circuit 100 differs from the corresponding weight value in the preset weight matrix, the set and/or reset operation is continued until the conductance value of the neurosynaptic unit circuit 100 is the same as the corresponding weight value in the preset weight matrix, and the initialization operation then ends.
It should be noted that, in the embodiments of the present disclosure, the conductance value of the neurosynaptic unit circuit 100 being "the same as" the corresponding weight value in the preset weight matrix means that the error between the conductance value and that weight value satisfies a preset condition (e.g., falls within a preset tolerance); the two are not required to be exactly equal.
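The write-verify procedure described in the preceding paragraphs can be sketched as a simple loop; the callables read_conductance, apply_set_pulse, and apply_reset_pulse stand in for the hardware operations of the driving and output circuits, and the tolerance and maximum pulse count are illustrative assumptions. For simplicity the sketch adjusts only one memristor of the pair; the other would be handled in the same way.

```python
def initialize_cell(target_value, read_conductance, apply_set_pulse,
                    apply_reset_pulse, tolerance=1e-6, max_pulses=1000):
    """Program-and-verify loop: pulse, then read, until the measured conductance
    matches the target within a preset tolerance ('the same' in the text above)."""
    for _ in range(max_pulses):
        g = read_conductance()          # measured from the bit line voltages and
                                        # the current detected on the source line
        error = g - target_value
        if abs(error) <= tolerance:     # preset condition satisfied
            return True
        if error < 0:
            apply_set_pulse()           # set raises the conductance
        else:
            apply_reset_pulse()         # reset lowers the conductance
    return False                        # did not converge within max_pulses
```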
The operation principle of the information processing system 1 shown in fig. 11 can refer to the corresponding description of the operation principle of the information processing system 1 shown in fig. 10, and will not be described herein again.
The information processing system 1 provided by the embodiments of the present disclosure can implement parallel computation, thereby increasing the computation speed and reducing the power consumption. The information processing system 1 can be applied to a multilayer neural network, an adaptive resonance network, or a convolutional neural network, with weight adjustment performed by the neural network circuit 10, thereby completing the corresponding calculation operation.
At least one embodiment of the present disclosure also provides an information processing method that can be used for the information processing system 1 provided by an embodiment of the present disclosure, as shown in fig. 12, the method including the following operations.
Step S20: sending a control signal to the driving circuit 30 according to the data to be processed;
step S30: supplying a driving voltage to the neural network circuit 10 according to the control signal; and
step S40: the output results of the neural network circuit 10 are processed.
In another embodiment of the present disclosure, as shown in fig. 12, the information processing method may further include step S10: the neural network circuit 10 is initialized according to a preset weight matrix.
In one example, as shown in fig. 13, the step S10 may include the following operations.
Step S110: setting and/or resetting each neurosynaptic unit circuit in the neural network circuit 10; and
step S120: it is detected whether the conductance value of each neurosynaptic unit circuit 100 in the neural network circuit 10 is the same as the corresponding weight value in the preset weight matrix.
In another example, as shown in fig. 13, the step S10 may further include step S130: if the conductance value of each neurosynaptic unit circuit in the neural network circuit is different from the corresponding weight value in the preset weight matrix, continuing the setting and/or resetting operation until the conductance value of each neurosynaptic unit circuit in the neural network circuit is the same as the corresponding weight value in the preset weight matrix. A sketch of the overall method is given below.
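The steps S10 to S40 above can be summarized in a top-level sketch; the circuit objects and their method names are placeholders for the control circuit, the driving circuit, the neural network circuit 10, and the output circuit 40, and are illustrative rather than an interface defined by the disclosure.

```python
def information_processing(data, weights, control, driver, crossbar, output):
    """Top-level flow of steps S10-S40 (object interfaces are placeholders)."""
    # Step S10: initialize the neural network circuit from the preset weight
    # matrix (set/reset each neurosynaptic unit circuit, then verify it).
    crossbar.initialize(weights)

    # Step S20: the control circuit turns the data to be processed into a
    # control signal for the driving circuit.
    control_signal = control.encode(data)

    # Step S30: the driving circuit converts the control signal into driving
    # voltages applied to the neural network circuit, which then computes.
    drive_voltages = driver.to_voltages(control_signal)
    currents = crossbar.compute(drive_voltages)

    # Step S40: the output circuit processes (samples and digitizes) the
    # output result of the neural network circuit.
    return output.process(currents)
```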
It should be noted that, for detailed description and technical effects of the information processing method provided by the embodiment of the present disclosure, reference may be made to description of the working principle of the information processing system 1 in the embodiment of the present disclosure, and details are not described here.
The above is only a specific embodiment of the present disclosure, but the scope of the present disclosure is not limited thereto, and the scope of the present disclosure should be determined by the scope of the claims.
Claims (17)
1. A neurosynaptic unit circuit comprising a first weight circuit and a second weight circuit; wherein,
the first weight circuit comprises a first resistance changing circuit and a first switch circuit, the first resistance changing circuit is electrically connected with a first bit line end and the first switch circuit, and the first switch circuit is electrically connected with a first word line end, the first resistance changing circuit and a source line end;
the second weight circuit comprises a second resistance changing circuit and a second switch circuit, the second resistance changing circuit is electrically connected with the second bit line end and the second switch circuit, and the second switch circuit is electrically connected with the second word line end, the second resistance changing circuit and the source line end.
2. The neurosynaptic unit circuit of claim 1, wherein,
the neurosynaptic unit circuit is configured such that, when a calculation operation is performed, a first bit line voltage applied to the first bit line terminal is greater than a source line voltage applied to the source line terminal, and a second bit line voltage applied to the second bit line terminal is less than the source line voltage.
3. The neurosynaptic unit circuit of claim 1 or 2, wherein,
the first resistive switching circuit comprises a first memristor, the first switching circuit comprises a first transistor, a first pole of the first memristor is connected with the first bit line end, a second pole of the first memristor is connected with the first pole of the first transistor, a gate of the first transistor is connected with the first word line end, and the second pole of the first transistor is connected with the source line end;
the second resistive switching circuit comprises a second memristor, the second switch circuit comprises a second transistor, a first pole of the second memristor is connected with the second bit line end, a second pole of the second memristor is connected with the first pole of the second transistor, a grid electrode of the second transistor is connected with the second word line end, and the second pole of the second transistor is connected with the source line end.
4. The neurosynaptic unit circuit of claim 3, wherein the first memristor and the second memristor comprise resistance graded devices.
5. A neural network circuit comprising a plurality of the neurosynaptic unit circuits of any one of claims 1-4 arranged in an array comprising a plurality of rows and a plurality of columns.
6. The neural network circuit of claim 5,
each column of the neurosynaptic unit circuits is correspondingly provided with a first bit line and a second bit line, the first bit line is electrically connected with the first resistance changing circuit in the corresponding column of the neurosynaptic unit circuits, and the second bit line is electrically connected with the second resistance changing circuit in the corresponding column of the neurosynaptic unit circuits;
each row of the neurosynaptic unit circuits is correspondingly provided with a first word line, a second word line and a source line, the first word line is electrically connected with the first switch circuit in the corresponding row of the neurosynaptic unit circuits, the second word line is electrically connected with the second switch circuit in the corresponding row of the neurosynaptic unit circuits, and the source line is electrically connected with the source line terminal in the corresponding row of the neurosynaptic unit circuits.
7. The neural network circuit of claim 5,
each row of the neurosynaptic unit circuits is correspondingly provided with a first bit line, a second bit line, a first word line and a second word line, the first bit line is electrically connected with the first resistance change circuit in the corresponding row of the neurosynaptic unit circuits, the second bit line is electrically connected with the second resistance change circuit in the corresponding row of the neurosynaptic unit circuits, the first word line is electrically connected with the first switch circuit in the corresponding row of the neurosynaptic unit circuits, and the second word line is electrically connected with the second switch circuit in the corresponding row of the neurosynaptic unit circuits;
and each column of the neurosynaptic unit circuits is correspondingly provided with a source line, and the source line is electrically connected with the source line terminal in the corresponding column of the neurosynaptic unit circuits.
8. The neural network circuit of claim 5,
each column of the neurosynaptic unit circuits is correspondingly provided with a first bit line and a second bit line, the first bit line is electrically connected with the first resistance changing circuit in the corresponding column of the neurosynaptic unit circuits, and the second bit line is electrically connected with the second resistance changing circuit in the corresponding column of the neurosynaptic unit circuits;
each row of the neurosynaptic unit circuits is correspondingly provided with a word line and a source line, the word line is electrically connected with the first switch circuit and the second switch circuit in the corresponding row of the neurosynaptic unit circuits, and the source line is electrically connected with the source line terminal in the corresponding row of the neurosynaptic unit circuits.
9. The neural network circuit of claim 5,
each row of the neurosynaptic unit circuits is correspondingly provided with a first bit line, a second bit line and a word line, the first bit line is electrically connected with the first resistance change circuit in the corresponding row of the neurosynaptic unit circuits, the second bit line is electrically connected with the second resistance change circuit in the corresponding row of the neurosynaptic unit circuits, and the word line is electrically connected with the first switch circuit and the second switch circuit in the corresponding row of the neurosynaptic unit circuits;
and each column of the neurosynaptic unit circuits is correspondingly provided with a source line, and the source line is electrically connected with the source line terminal in the corresponding column of the neurosynaptic unit circuits.
10. An information processing system comprising the neural network circuit of any one of claims 5-9, a control circuit, a drive circuit, and an output circuit; wherein,
the control circuit is configured to send a control signal to the drive circuit according to input data to be processed when calculation operation is performed;
the drive circuit is configured to provide a drive voltage to the neural network circuit in accordance with the control signal; and
the output circuit is configured to process an output result of the neural network circuit.
11. The information processing system of claim 10, wherein, in a case where the neural network circuit includes m rows of neurosynaptic unit circuits, the data to be processed includes one-dimensional matrix data of size 1 × m, m being an integer greater than 1.
12. The information processing system according to claim 10 or 11, wherein the output circuit includes a sample-and-hold circuit and an analog-to-digital conversion circuit;
the sample-and-hold circuit is configured to collect an analog current output by the neural network circuit, and the analog-to-digital conversion circuit is configured to convert the analog current to a digital current.
13. The information processing system according to claim 10 or 11, wherein the output circuit includes a sample-and-hold circuit and an analog-to-digital conversion circuit;
the sample-and-hold circuit is configured to collect an analog current output by the neural network circuit and convert the analog current to an analog voltage, and the analog-to-digital conversion circuit is configured to convert the analog voltage to a digital voltage.
14. An information processing method of the information processing system according to any one of claims 10 to 13, comprising:
sending a control signal to the driving circuit according to the data to be processed;
providing a driving voltage to the neural network circuit according to the control signal; and
processing the output result of the neural network circuit.
15. The information processing method according to claim 14, further comprising:
initializing the neural network circuit according to a preset weight matrix.
16. The information processing method of claim 15, wherein the initializing the neural network circuit according to a preset weight matrix comprises:
setting and/or resetting each neurosynaptic unit circuit in the neural network circuit; and
detecting whether the conductance value of each neurosynaptic unit circuit in the neural network circuit is the same as the corresponding weight value in the preset weight matrix.
17. The information processing method according to claim 16, further comprising:
if the conductance value of each neurosynaptic unit circuit in the neural network circuit is different from the corresponding weight value in the preset weight matrix, continuing the setting and/or resetting operation until the conductance value of each neurosynaptic unit circuit in the neural network circuit is the same as the corresponding weight value in the preset weight matrix.
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK40001180A | 2020-02-21 |
| HK40001180B | 2021-08-20 |