Calculate VRAM needed for training and inference of HF models (see the estimation sketch below)
Check if your GPU can run a selected LLM
Explore and submit models for benchmarking
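As a rough illustration of the VRAM calculation and GPU-fit check above, the sketch below estimates memory from parameter count and precision alone. It is a hypothetical back-of-the-envelope approximation, not the calculator's actual formula: the function name `estimate_vram_gb`, the ~20% inference overhead factor, and the 16-bytes-per-parameter training rule of thumb (weights, gradients, and Adam optimizer states under mixed precision) are all assumptions.

```python
def estimate_vram_gb(num_params_billion: float,
                     bytes_per_param: int = 2,   # fp16/bf16 weights
                     training: bool = False) -> float:
    """Rough VRAM estimate in GB for a dense transformer.

    Inference: weight memory plus ~20% overhead for activations / KV cache.
    Training: ~16 bytes per parameter as a common mixed-precision Adam
    rule of thumb (weights + gradients + optimizer states).
    """
    params = num_params_billion * 1e9
    if training:
        total_bytes = params * 16                      # assumed training rule of thumb
    else:
        total_bytes = params * bytes_per_param * 1.2   # assumed ~20% runtime overhead
    return total_bytes / 1024 ** 3


if __name__ == "__main__":
    # Example: a 7B-parameter model in fp16
    inference_gb = estimate_vram_gb(7)
    training_gb = estimate_vram_gb(7, training=True)
    print(f"7B inference (fp16): ~{inference_gb:.1f} GB")
    print(f"7B training  (Adam): ~{training_gb:.1f} GB")

    # GPU-fit check against a hypothetical 24 GB card
    gpu_vram_gb = 24
    print("Fits on a 24 GB GPU for inference:", inference_gb <= gpu_vram_gb)
```

Real calculators typically refine this with sequence length, batch size, KV-cache size, quantization, and framework overhead, so treat these numbers as order-of-magnitude estimates only.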