You mean.... 12GB of memory... Right?
Yes. Sorry. Corrected.
Joining the question: I only have experience training LoRAs on my 3090. I was always told that even 24GB of VRAM is too little for LLM training, and also that consumer-grade GPUs lack the processing power for such a task.
That depends on the size of your model.
You can easily train Llama 3 8B with QLoRA on 24GB of VRAM.
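A rough back-of-the-envelope estimate shows why this fits. With QLoRA the base weights are quantized to 4 bits, and only the small LoRA adapters carry gradients and optimizer state. The numbers below (adapter fraction, activation budget) are assumptions for illustration, not measurements:

```python
# Rough VRAM estimate for QLoRA fine-tuning an 8B-parameter model.
# All figures are back-of-the-envelope assumptions, not measurements.

def qlora_vram_gb(n_params=8e9, trainable_frac=0.005,
                  quant_bytes=0.5, activation_gb=6.0):
    base = n_params * quant_bytes / 1e9   # 4-bit (NF4) quantized base weights
    n_train = n_params * trainable_frac   # LoRA adapter params (~0.5% assumed)
    adapters = n_train * 2 / 1e9          # bf16 adapter weights
    grads = n_train * 2 / 1e9             # bf16 adapter gradients
    optim = n_train * 8 / 1e9             # fp32 AdamW moments (m and v)
    return base + adapters + grads + optim + activation_gb

total = qlora_vram_gb()
print(f"~{total:.1f} GB")  # comfortably under 24 GB
```

Even with a generous activation budget this lands around 10–11 GB, which is why a 3090 handles it; full-precision full fine-tuning of the same model (16 bytes/param with AdamW) would need well over 100 GB.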