Model Card for IQuest-Coder-V1-7B-Instruct-Q6K-gguf
This is the Q6_K quantized version of IQuest-Coder-V1-7B-Instruct in GGUF format, intended for direct use with local chat GUIs. It was created using llama.cpp.
Thanks to the IQuest team for releasing this model.
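Once the GGUF file is downloaded, it can be run locally with llama.cpp's command-line tool. A minimal sketch follows; the GGUF filename is an assumption inferred from the repository name, so adjust it to match the actual file you downloaded:

```shell
# Minimal sketch: run the Q6_K quantized model with llama.cpp's CLI.
# The .gguf filename below is an assumption based on the repo name;
# replace it with the actual file shipped in this repository.
llama-cli \
  -m IQuest-Coder-V1-7B-Instruct-Q6_K.gguf \
  -p "Write a Python function that reverses a string." \
  -n 256
```

The same `.gguf` file can also be loaded by most local chat GUIs that support llama.cpp backends by pointing them at the model path.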
Model tree for Astro-higgs/IQuest-Coder-V1-7B-Instruct-Q6K-gguf
- Base model: IQuestLab/IQuest-Coder-V1-7B-Instruct