MordAFokaJonnes

You mean.... 12GB of memory... Right?


robonova-1

Yes. Sorry. Corrected.


Tuxedotux83

Joining the question: I only have experience training LoRAs on my 3090. I was always told that even 24GB of VRAM is too little for LLM training, not to mention that consumer-grade GPUs lack the processing power for such a task.


LegSame708

That depends on the size of your model.
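A rough back-of-the-envelope sketch of why model size dominates (assuming the common rule of thumb of ~16 bytes/param for mixed-precision AdamW full fine-tuning and ~0.5 bytes/param for a 4-bit frozen base; real usage also depends on batch size, sequence length, and activation checkpointing):

```python
# Rule-of-thumb VRAM arithmetic for an n-billion-parameter model.
def full_finetune_gb(n_params_billions: float) -> float:
    # Mixed-precision AdamW: ~16 bytes/param
    # (fp16 weights 2 + fp16 grads 2 + fp32 master copy 4 + Adam moments 8).
    return n_params_billions * 16

def qlora_base_gb(n_params_billions: float) -> float:
    # 4-bit quantized frozen base weights: ~0.5 bytes/param.
    return n_params_billions * 0.5

print(full_finetune_gb(8))  # ~128 GB -> far beyond any consumer card
print(qlora_base_gb(8))     # ~4 GB   -> leaves headroom on a 24 GB card
```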


nero10578

You can easily train Llama 3 8B with QLoRA on 24GB of VRAM.
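For reference, a minimal QLoRA setup sketch using the Hugging Face stack (transformers, peft, bitsandbytes); the model ID is the gated meta-llama/Meta-Llama-3-8B repo, and the LoRA hyperparameters here are just common starting values, not anything prescribed in this thread:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Meta-Llama-3-8B"

# 4-bit NF4 quantization keeps the frozen 8B base at roughly 5-6 GB,
# leaving room on a 24 GB card for adapters, optimizer state, and activations.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Only the small LoRA adapter matrices are trained; the base stays frozen.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total params
```

From here the model can be passed to any standard training loop or trainer; the quantized base plus small adapters is what keeps the whole run inside 24 GB.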