Code and datasets for the Tsetlin Machine
Implements the Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, Weighted Tsetlin Machine, and Embedding Tsetlin Machine, with support for continuous features, multigranularity, clause indexing, and literal budget
Contextual Bandits in R - simulation and evaluation of Multi-Armed Bandit Policies
A checkers reinforcement learning AI, and all the tools needed to train it.
Tutorial on the Convolutional Tsetlin Machine
Multi-threaded implementation of the Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, and Weighted Tsetlin Machine, with support for continuous features and multigranularity.
Contextual bandit algorithm LinUCB (Linear Upper Confidence Bound), as proposed by Li, Chu, Langford, and Schapire
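For reference, disjoint LinUCB keeps one ridge-regression model per arm and plays the arm with the highest upper confidence bound on its predicted reward. A minimal sketch in Python (class and parameter names are illustrative, not taken from the repository above):

```python
import numpy as np

class LinUCB:
    """Minimal disjoint LinUCB sketch (Li et al., WWW 2010).

    One ridge-regression model per arm; the arm with the highest
    upper confidence bound on its predicted reward is played.
    """

    def __init__(self, n_arms, d, alpha=1.0):
        self.alpha = alpha                             # exploration strength
        self.A = [np.eye(d) for _ in range(n_arms)]    # per-arm Gram matrices
        self.b = [np.zeros(d) for _ in range(n_arms)]  # per-arm reward vectors

    def select(self, x):
        """Pick an arm for context vector x of shape (d,)."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                          # ridge-regression estimate
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x
```

Here alpha trades off exploration against exploitation: larger values widen the confidence bonus and make the policy try uncertain arms more often.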
Privacy-Preserving Bandits (MLSys'20)
Client that handles the administration of StreamingBandit, online or straight from your desktop. Set up and run streaming (contextual) bandit experiments in your browser.
Some visualizations of bandit algorithm outputs.
Bandit learning on top of Neural Monkey, an open-source tool for sequence learning in NLP built on TensorFlow. Bandit online learning objectives live in the bandits-acl branch (ACL 2017); counterfactual learning objectives live in the acl-2018 branch (ACL 2018).
Simple implementations of bandit algorithms in Python
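The simplest such algorithm is epsilon-greedy, which explores a random arm with probability eps and otherwise exploits the arm with the best running mean reward. A minimal sketch (not taken from the repository above):

```python
import random

class EpsilonGreedy:
    """Minimal epsilon-greedy bandit: explore with probability eps,
    otherwise exploit the arm with the best running mean reward."""

    def __init__(self, n_arms, eps=0.1):
        self.eps = eps
        self.counts = [0] * n_arms    # pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select(self):
        if random.random() < self.eps:
            return random.randrange(len(self.counts))  # explore
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm, reward):
        self.counts[arm] += 1
        # incremental update of the running mean
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]
```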
🦊 A series of bandit algorithms in Swift
Based on the article "Online Clustering of Bandits" by Gentile, Li, and Zappella
Bayesian bandits in Python 3.
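Bayesian bandits with Bernoulli rewards are typically implemented via Thompson sampling with Beta priors. A minimal sketch assuming 0/1 rewards (names are illustrative, not from the repository above):

```python
import random

class BetaBernoulliTS:
    """Thompson sampling for Bernoulli rewards with Beta(1, 1) priors."""

    def __init__(self, n_arms):
        self.alpha = [1] * n_arms  # 1 + successes per arm
        self.beta = [1] * n_arms   # 1 + failures per arm

    def select(self):
        # sample a plausible success rate for each arm, play the best sample
        samples = [random.betavariate(a, b)
                   for a, b in zip(self.alpha, self.beta)]
        return max(range(len(samples)), key=samples.__getitem__)

    def update(self, arm, reward):
        # reward must be 0 or 1; update the arm's posterior counts
        self.alpha[arm] += reward
        self.beta[arm] += 1 - reward
```

Sampling from the posterior naturally balances exploration and exploitation: uncertain arms produce high samples often enough to keep being tried.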
Detailed solutions to the OverTheWire wargames, currently covering Bandit, with more to follow.
Decoder, aligner, and model optimizer for statistical machine translation and other structured prediction models based on (mostly) context-free formalisms
Implementations of reinforcement learning algorithms
A policy gradient approach to a multi-armed bandit problem
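The standard policy gradient treatment of a multi-armed bandit is the gradient bandit algorithm (Sutton and Barto): a softmax over per-arm preferences, nudged toward arms whose rewards beat a running baseline. A minimal sketch (illustrative, not taken from the repository above):

```python
import numpy as np

class GradientBandit:
    """Gradient bandit: softmax policy over arm preferences H,
    updated by gradient ascent on expected reward with a baseline."""

    def __init__(self, n_arms, lr=0.1):
        self.H = np.zeros(n_arms)  # arm preferences
        self.lr = lr
        self.baseline = 0.0        # running mean reward
        self.t = 0

    def _policy(self):
        expH = np.exp(self.H - self.H.max())  # numerically stable softmax
        return expH / expH.sum()

    def select(self):
        return int(np.random.choice(len(self.H), p=self._policy()))

    def update(self, arm, reward):
        self.t += 1
        self.baseline += (reward - self.baseline) / self.t
        advantage = reward - self.baseline
        one_hot = np.zeros_like(self.H)
        one_hot[arm] = 1.0
        # gradient ascent step: raise the played arm's preference if the
        # reward beat the baseline, lower the others proportionally
        self.H += self.lr * advantage * (one_hot - self._policy())
```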