Top Python Transformers repositories on GitHub
Transformer model implementations, training kits, and fine-tuning tooling. Filtered to projects whose primary language is Python.
Ranked by stars across 781 Python repositories tagged transformers. Refreshed daily.
- 1. hiyouga/LlamaFactory · ★ 70,990 · ⑂ 8,673
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
- fine-tuning
- llama
- llm
- peft
- transformers
- rlhf
- 2. labmlai/annotated_deep_learning_paper_implementations · ★ 66,547 · ⑂ 6,713
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
- deep-learning
- deep-learning-tutorial
- pytorch
- gan
- transformers
- reinforcement-learning
- 3. lucidrains/vit-pytorch · ★ 25,149 · ⑂ 3,495
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
- artificial-intelligence
- attention-mechanism
- transformers
- computer-vision
- image-classification
- 4. huggingface/peft · ★ 21,074 · ⑂ 2,278
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
- adapter
- diffusion
- llm
- parameter-efficient-learning
- python
- pytorch
- 5. stas00/ml-engineering · ★ 17,870 · ⑂ 1,137
Machine Learning Engineering Open Book
- pytorch
- slurm
- large-language-models
- llm
- machine-learning
- scalability
- 6. arc53/DocsGPT · ★ 17,870 · ⑂ 2,033
Private AI platform for agents, assistants and enterprise search. Built-in Agent Builder, Deep research, Document analysis, Multi-model support, and API connectivity for agents.
- ai
- python
- natural-language-processing
- react
- chatgpt
- docsgpt
- 7. NVIDIA/Megatron-LM · ★ 16,238 · ⑂ 3,915
Ongoing research training transformer models at scale
- large-language-models
- model-para
- transformers
- 8. BlinkDL/RWKV-LM · ★ 14,513 · ⑂ 1,009
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RNN and transformer - great performance, linear time, constant space (no kv-cache), fast training, infinite ctx_len, and free sentence embedding.
- attention-mechanism
- deep-learning
- gpt
- gpt-2
- gpt-3
- language-model
- 9. PaddlePaddle/PaddleNLP · ★ 12,937 · ⑂ 3,044
Easy-to-use and powerful LLM and SLM library with awesome model zoo.
- nlp
- embedding
- bert
- ernie
- paddlenlp
- pretrained-models
- 10. neuml/txtai · ★ 12,471 · ⑂ 808
💡 All-in-one AI framework for semantic search, LLM orchestration and language model workflows
- python
- search
- nlp
- semantic-search
- vector-search
- txtai
- 11. qubvel-org/segmentation_models.pytorch · ★ 11,532 · ⑂ 1,835
Semantic segmentation models with 500+ pretrained convolutional and transformer-based backbones.
- segmentation
- image-processing
- pspnet
- unet
- unet-pytorch
- pytorch
- 12. speechbrain/speechbrain · ★ 11,517 · ⑂ 1,686
A PyTorch-based Speech Toolkit
- speech-recognition
- speech-toolkit
- speaker-recognition
- speech-to-text
- speech-enhancement
- speech-separation
- 13. OpenRLHF/OpenRLHF · ★ 9,457 · ⑂ 936
An Easy-to-use, Scalable and High-performance Agentic RL Framework based on Ray (PPO & DAPO & REINFORCE++ & VLM & TIS & vLLM & Ray & Async RL)
- transformers
- vllm
- large-language-models
- raylib
- reinforcement-learning-from-human-feedback
- reinforcement-learning
- 14. intel/ipex-llm · ★ 8,801 · ⑂ 1,424
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, DeepSeek, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, DeepSpeed, Axolotl, etc.
- pytorch
- llm
- transformers
- gpu
- 15. EleutherAI/gpt-neo · ★ 8,279 · ⑂ 961
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
- language-model
- transformers
- gpt
- gpt-2
- gpt-3
- 16. jessevig/bertviz · ★ 8,046 · ⑂ 876
BertViz: Visualize Attention in Transformer Models
- natural-language-processing
- machine-learning
- visualization
- neural-network
- pytorch
- nlp
- 17. microsoft/presidio · ★ 7,969 · ⑂ 1,035
An open-source framework for detecting, redacting, masking, and anonymizing sensitive data (PII) across text, images, and structured data. Supports NLP, pattern matching, and customizable pipelines.
- python
- pii
- privacy
- data-anonymization
- de-identification
- data-masking
- 18. lucidrains/PaLM-rlhf-pytorch · ★ 7,869 · ⑂ 679
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
- artificial-intelligence
- attention-mechanisms
- deep-learning
- reinforcement-learning
- transformers
- human-feedback
- 19. MaartenGr/BERTopic · ★ 7,587 · ⑂ 893
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
- bert
- transformers
- topic-modeling
- sentence-embeddings
- nlp
- machine-learning
- 20. EleutherAI/gpt-neox · ★ 7,425 · ⑂ 1,111
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
- deepspeed-library
- gpt-3
- transformers
- language-model
- 21. vladmandic/sdnext · ★ 7,079 · ⑂ 558
SD.Next: All-in-one WebUI for AI generative image and video creation, captioning and processing
- sdnext
- ai-art
- caption
- diffusers
- generative-art
- python
- 22. Blaizzy/mlx-audio · ★ 6,953 · ⑂ 579
A text-to-speech (TTS), speech-to-text (STT) and speech-to-speech (STS) library built on Apple's MLX framework, providing efficient speech analysis on Apple Silicon.
- apple-silicon
- audio-processing
- mlx
- multimodal
- speech-recognition
- speech-synthesis
- 23. SkalskiP/courses · ★ 6,436 · ⑂ 594
This repository is a curated collection of links to various courses and resources about Artificial Intelligence (AI)
- computer-vision
- deep-learning
- deep-neural-networks
- machine-learning
- mlops
- multimodal
- 24. rohitg00/ai-engineering-from-scratch · ★ 6,431 · ⑂ 1,347
Learn it. Build it. Ship it for others.
- agents
- ai
- ai-agents
- ai-engineering
- computer-vision
- course
- 25. argosopentech/argos-translate · ★ 5,941 · ⑂ 447
Open-source offline translation library written in Python
- python
- machine-translation
- transformers
- translation
- language-models
- linux
Find Python engineers shipping Transformers
The list above ranks the most-starred public Python repositories tagged with the Transformers topic, drawn from the public GitHub graph. Across 781 matching repositories, the contributors are a tight cluster of engineers with both Python chops and real Transformers experience.
That overlap is rare. Most Python engineers have never shipped Transformers work, and the engineers with deep Transformers experience are concentrated in a small number of projects. The people in this list's contributor graph are the ones who combine both.
Refolk turns this list into a search. Ask for “Python Transformers maintainers hiring” or “Python engineers shipping Transformers in 2025” and Refolk returns a ranked shortlist with the commits, profiles, and projects behind each name.
How this list is built
Last refreshed: Thu, 07 May 2026 05:54:24 GMT
Need a more specific search?
Refolk runs natural-language searches across GitHub, LinkedIn, and the open web.
Related lists
- Python · Machine learning
- Python · Deep learning
- Python · Computer vision
- Python · Natural language processing
- Python · LLM
- Python · AI agents
- Python · RAG
- Python · Embeddings
See all repository lists.