A framework for large scale recommendation algorithms.
[NeurIPS 2023] Michelangelo: Conditional 3D Shape Generation based on Shape-Image-Text Aligned Latent Representation
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies.
Repository for Project Insight: NLP as a Service
A compilation of the best multi-agent papers
Federated Learning Utilities and Tools for Experimentation
CLIP (Contrastive Language–Image Pre-training) for Italian
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.
I will implement Fastai in each project in this repository.
Retrieval-based Voice Conversion (RVC) implemented with Hugging Face Transformers.
Pytorch implementation of image captioning using transformer-based model.
[TMI 2023] XBound-Former: Toward Cross-scale Boundary Modeling in Transformers
This project investigates the security of large language models by performing binary classification of input prompts to detect malicious ones. Several approaches have been analyzed, using classical ML algorithms, a trained LLM, and a fine-tuned LLM.
Official repository for the paper "ALERT: A Comprehensive Benchmark for Assessing Large Language Models’ Safety through Red Teaming"
Image Captioning with Vision Transformers (ViTs): transformer models that generate descriptive captions for images by combining the power of Transformers and computer vision, leveraging state-of-the-art pre-trained ViT models.
This repository contains my 75-Day Hard Generative AI and LLM Learning Challenge.
A radically simple, reliable, and high-performance template to quickly get set up building multi-agent applications.
An ASR (Automatic Speech Recognition) adversarial attack repository.
Symbolic music generation taking inspiration from NLP and human composition process
Neural Persian Poet: A sequence-to-sequence model for composing Persian poetry