Choi et al., 2018 - Google Patents
Content addressable memory based binarized neural network accelerator using time-domain signal processing
- Document ID
- 11889892835725128755
- Author
- Choi W
- Jeong K
- Choi K
- Lee K
- Park J
- Publication year
- 2018
- Publication venue
- Proceedings of the 55th Annual Design Automation Conference
Snippet
Binarized neural network (BNN) is one of the most promising solutions for low-cost convolutional neural network acceleration. Since BNN is based on binarized bit-level operations, there exist great opportunities to reduce power-hungry data transfers and …
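The "binarized bit-level operations" the snippet refers to can be illustrated with the standard BNN trick: when weights and activations are constrained to {-1, +1} and packed as bits (+1 → 1, -1 → 0), a dot product reduces to an XNOR followed by a popcount. This is a generic sketch of that arithmetic, not code from the paper; the function name and encoding are illustrative.

```python
def bnn_dot(a_bits: int, w_bits: int, n: int) -> int:
    """Dot product of two n-element {-1, +1} vectors packed into ints.

    Bit convention: +1 is stored as 1, -1 as 0.
    """
    mask = (1 << n) - 1
    xnor = ~(a_bits ^ w_bits) & mask   # 1 wherever the two signs agree
    matches = bin(xnor).count("1")     # popcount of agreeing positions
    return 2 * matches - n             # matches minus mismatches

# a = [+1, -1, +1, -1] -> 0b1010; w = [+1, +1, -1, -1] -> 0b1100
# elementwise products: +1, -1, -1, +1, so the dot product is 0
print(bnn_dot(0b1010, 0b1100, 4))  # -> 0
```

Hardware accelerators such as the one described here exploit exactly this reduction: the multiply-accumulate becomes a bitwise match-and-count, which maps naturally onto content-addressable memory search operations.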
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C11/00—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C11/21—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using electric elements
- G11C11/34—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using electric elements using semiconductor devices
- G11C11/40—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using electric elements using semiconductor devices using transistors
- G11C11/41—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using electric elements using semiconductor devices using transistors forming static cells with positive feedback, i.e. cells not needing refreshing or charge regeneration, e.g. bistable multivibrator or Schmitt trigger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/04—Architectures, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/50—Computer-aided design
- G06F17/5009—Computer-aided design using simulation
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C11/00—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C11/56—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
- G11C11/5621—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency using charge storage in a floating gate
- G11C11/5642—Sensing or reading circuits; Data output circuits
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C11/00—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C11/56—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
- G11C11/565—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency using capacitive charge storage elements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C15/00—Digital stores in which information comprising one or more characteristic parts is written into the store and in which information is read-out by searching for one or more of these characteristic parts, i.e. associative or content-addressed stores
- G11C15/04—Digital stores in which information comprising one or more characteristic parts is written into the store and in which information is read-out by searching for one or more of these characteristic parts, i.e. associative or content-addressed stores using semiconductor elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F7/00—Methods or arrangements for processing data by operating upon the order or content of the data handled
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6267—Classification techniques
- G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C2211/00—Indexing scheme relating to digital stores characterized by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C2211/56—Indexing scheme relating to G11C11/56 and sub-groups for features not covered by these groups
Similar Documents
| Publication | Title |
|---|---|
| Choi et al. | Content addressable memory based binarized neural network accelerator using time-domain signal processing |
| Agrawal et al. | Xcel-RAM: Accelerating binary neural networks in high-throughput SRAM compute arrays |
| Jhang et al. | Challenges and trends of SRAM-based computing-in-memory for AI edge devices |
| Pedretti et al. | Tree-based machine learning performed in-memory with memristive analog CAM |
| US9697877B2 (en) | Compute memory |
| US10360971B1 (en) | Artificial neural network functionality within dynamic random-access memory |
| US11205476B1 (en) | Read data processing circuits and methods associated with computational memory cells |
| Lin et al. | A review on SRAM-based computing in-memory: Circuits, functions, and applications |
| US11966714B2 (en) | Ternary in-memory accelerator |
| EP3506084B1 (en) | System and method for tunable precision of dot-product engine |
| CN112819147B (en) | Memory-based neuromorphic device |
| CN112581996A (en) | Time-domain computing-in-memory array structure based on magnetic random access memory |
| US11817173B2 (en) | Timing-based computer architecture systems and methods |
| US20250131259A1 (en) | MTJ-based hardware synapse implementation for binary and ternary deep neural networks |
| Bose et al. | A 51.3-TOPS/W, 134.4-GOPS in-memory binary image filtering in 65-nm CMOS |
| US20230031841A1 (en) | Folding column adder architecture for digital compute in memory |
| Rybalkin et al. | Efficient hardware architectures for 1D- and MD-LSTM networks |
| CN110941185A (en) | Double-word-line 6T SRAM cell circuit for binary neural networks |
| Kang et al. | Deep in-memory architectures for machine learning |
| Jiang et al. | Compute-in-Memory Architecture |
| Yu et al. | A Dual 7T SRAM-Based Zero-Skipping Compute-In-Memory Macro With 1-6b Binary Searching ADCs for Processing Quantized Neural Networks |
| Tchendjou et al. | Spintronic memristor-based binarized ensemble convolutional neural network architectures |
| Chiang et al. | A 14 μJ/decision keyword-spotting accelerator with in-SRAM computing and on-chip learning for customization |
| CN115658010A | Pulse width modulation circuit, quantization circuit, storage circuit and chip |
| Agrawal | Compute-in-memory primitives for energy-efficient machine learning |