DGMs for NLP. A roadmap.
Leveraging Recursive Gumbel-Max Trick for Approximate Inference in Combinatorial Spaces, NeurIPS 2021
Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
An official repository for the VAE tutorial of Probabilistic Modelling and Reasoning (2023/2024), a University of Edinburgh master's course.
[ICML 2021] "Progressive-Scale Boundary Blackbox Attack via Projective Gradient Estimation" by Jiawei Zhang*, Linyi Li*, Huichen Li, Xiaolu Zhang, Shuang Yang, Bo Li
SCOBO: Sparsity-aware Comparison Oracle Based Optimization
On the Hardness of Probabilistic Neurosymbolic Learning (ICML 2024)
Julia code for running the numerical experiment in Subsection 4.6.2 of Brian Irwin's PhD thesis.
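
The Gumbel-Max trick referenced in the NeurIPS 2021 entry above samples from a categorical distribution by adding Gumbel noise to the logits and taking an argmax; its Gumbel-Softmax relaxation is a standard differentiable surrogate used for gradient estimation through discrete variables. The sketch below shows the plain (non-recursive) versions of both ideas; it is a generic illustration, not code from that paper's repository, and all names are chosen for the example.

```python
import numpy as np

def gumbel_max_sample(logits, rng):
    """Gumbel-Max trick: argmax over (logits + Gumbel noise) is an exact
    sample from the categorical distribution softmax(logits)."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    return np.argmax(logits + gumbel)

def gumbel_softmax_sample(logits, tau, rng):
    """Gumbel-Softmax relaxation: a differentiable surrogate for the hard
    sample; lower temperatures tau push the output closer to one-hot."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max())
    return y / y.sum()

rng = np.random.default_rng(0)
logits = np.log(np.array([0.1, 0.6, 0.3]))
print(gumbel_max_sample(logits, rng))           # hard sample, e.g. index 1
print(gumbel_softmax_sample(logits, 0.5, rng))  # soft, near one-hot vector
```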
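
Several of the repositories above (ZORO, SCOBO, and the boundary blackbox attack) revolve around zeroth-order gradient estimation, i.e. approximating gradients from function evaluations alone. For reference, the following minimal sketch shows the basic randomized finite-difference estimator that these methods refine; it is not code from any of the listed projects, and the function names and parameters are illustrative assumptions.

```python
import numpy as np

def zeroth_order_gradient(f, x, num_samples=50, mu=1e-3, rng=None):
    """Randomized finite-difference gradient estimate of f at x.

    Averages (f(x + mu*u) - f(x)) / mu * u over Gaussian directions u,
    which approximates the gradient of a smoothed version of f using
    only function evaluations (no autodiff)."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(x)
    grad = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape[0])
        grad += (f(x + mu * u) - fx) / mu * u
    return grad / num_samples

# Toy usage: estimate the gradient of a quadratic from function values alone.
f = lambda x: np.sum(x ** 2)
x0 = np.array([1.0, -2.0, 0.5])
print(zeroth_order_gradient(f, x0))  # roughly 2 * x0
```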