🙋🏻‍♂️ Hello my lovelies! It is with great pleasure that I present to you my working, one-click-deploy, 16 GB RAM, completely free Hugging Face Spaces deployment.

Repo: Tonic/hugging-claw (use git clone to inspect)

Literally the one-click link: Tonic/hugging-claw

You can also run it locally and see for yourself:

```
docker run -it -p 7860:7860 --platform=linux/amd64 \
  -e HF_TOKEN="YOUR_VALUE_HERE" \
  -e OPENCLAW_GATEWAY_TRUSTED_PROXIES="YOUR_VALUE_HERE" \
  -e OPENCLAW_GATEWAY_PASSWORD="YOUR_VALUE_HERE" \
  -e OPENCLAW_CONTROL_UI_ALLOWED_ORIGINS="YOUR_VALUE_HERE" \
  registry.hf.space/tonic-hugging-claw:latest
```

There are just a few quite minor details I'll take care of, but I wanted to share here first.
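For anyone who wants to inspect the repo as suggested above, here is a minimal sketch of the clone step, assuming the standard Hugging Face Spaces git URL pattern (`https://huggingface.co/spaces/<user>/<space>`):

```shell
# Clone the Space's git repo to read the Dockerfile and config locally.
# URL pattern is the standard one for HF Spaces (assumption: this Space is public).
git clone https://huggingface.co/spaces/Tonic/hugging-claw
cd hugging-claw
ls
```

From there you can review the Dockerfile before running the `docker run` command above with your own values for the environment variables.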