tiiuae/Falcon-H1-Tiny-90M-Instruct-pre-DPO Text Generation • 91.1M • Updated about 14 hours ago • 28 • 2
Falcon-H1-Tiny Collection A series of extremely small, yet powerful language models redefining capabilities at small scale • 22 items • Updated about 16 hours ago • 22
Falcon-H1R: Pushing the Reasoning Frontiers with a Hybrid Model for Efficient Test-Time Scaling Paper • 2601.02346 • Published 11 days ago • 25
Learnable Multipliers: Freeing the Scale of Language Model Matrix Layers Paper • 2601.04890 • Published 8 days ago • 40
Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance Paper • 2507.22448 • Published Jul 30, 2025 • 69
Don't Overthink it. Preferring Shorter Thinking Chains for Improved LLM Reasoning Paper • 2505.17813 • Published May 23, 2025 • 58
Falcon-H1 Collection Falcon-H1 Family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B (pretrained & instruction-tuned). • 39 items • Updated 7 days ago • 59
Article Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance May 21, 2025 • 38