A sharded GGUF version of mradermacher/LFM2.5-1.2B-Thinking-i1-GGUF.
Model tree for Felladrin/gguf-sharded-Q4_K_S-LFM2.5-1.2B-Thinking
- Base model: LiquidAI/LFM2.5-1.2B-Base
- Finetuned: LiquidAI/LFM2.5-1.2B-Thinking