🔬 Nano size Theano LSTM module (Python, updated Nov 16, 2016)
A tour of different optimization algorithms in PyTorch.
Visualization of various deep learning optimization algorithms using PyTorch automatic differentiation and optimizers.
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
Hands-on implementation of gradient descent-based optimizers in raw Python
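Several of the repositories above implement gradient descent from scratch. As a point of reference, a minimal sketch of plain gradient descent in raw Python might look like this (the function names and the example objective are illustrative, not taken from any listed repo):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
# x_min converges to approximately 3.0
```

With a fixed learning rate of 0.1 on this quadratic, the distance to the minimum shrinks by a factor of 0.8 per step, so 100 steps suffice for convergence.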
Neural Networks and optimizers from scratch in NumPy, featuring newer optimizers such as DemonAdam or QHAdam.
Classification of data using neural networks, with backpropagation (multilayer perceptron) and with counterpropagation
Gradient descent optimization algorithms
Using DenseNet for image classification in PyTorch
"Simulations for the paper 'An Overview of Gradient Descent Optimization Algorithms' by Sebastian Ruder"
Deep Learning Optimizers
Applied the LSTM algorithm to the Amazon Fine Food Reviews dataset
Clean, dependency-free implementation of the ADADELTA algorithm in Python.
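For context on what such a dependency-free implementation involves: ADADELTA (Zeiler, 2012) maintains decaying averages of squared gradients and squared updates, and needs no hand-tuned learning rate. A minimal sketch in pure Python (the function name and the quadratic example are illustrative assumptions, not from the repo above):

```python
import math

def adadelta_minimize(grad, x0, rho=0.95, eps=1e-6, steps=3000):
    """Minimize a 1-D function via ADADELTA.

    eg2:  decaying average of squared gradients, E[g^2]
    edx2: decaying average of squared updates,   E[dx^2]
    """
    x = x0
    eg2 = 0.0
    edx2 = 0.0
    for _ in range(steps):
        g = grad(x)
        eg2 = rho * eg2 + (1 - rho) * g * g
        # Update scaled by the ratio of RMS of past updates to RMS of gradients.
        dx = -math.sqrt(edx2 + eps) / math.sqrt(eg2 + eps) * g
        edx2 = rho * edx2 + (1 - rho) * dx * dx
        x += dx
    return x

# Minimize f(x) = (x - 3)^2, gradient 2 * (x - 3).
x_min = adadelta_minimize(lambda x: 2 * (x - 3), x0=0.0)
```

Because `edx2` starts at zero, the first updates are tiny (on the order of `sqrt(eps)`) and grow as update history accumulates, which is why this sketch runs for a few thousand steps on even a simple quadratic.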
Experimenting with MNIST using the MXNet machine learning framework
A deep learning classification program for CT-scan results, written in Python
Coursework on global optimization methods (BGD, Adadelta)
Machine learning algorithms implemented from scratch in Python
A comparison study of different optimizers: visualizing the sources of their differences and identifying the best choice for a specific task
Data Structures, Algorithms and Machine Learning Optimization