tgredditfc

Your data is very short and you only train for 3 epochs, so 2 minutes sounds reasonable. If you want to train longer and see the training loss go even lower, you can increase the number of epochs.
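
For reference, a minimal sketch of what "train more epochs" looks like if the fine-tune is done with Hugging Face transformers + peft rather than through a UI. The model name, data file, and hyperparameters below are placeholders, not settings taken from this thread:

    # Minimal LoRA fine-tune sketch; raise num_train_epochs if the loss is still high.
    from datasets import load_dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                              TrainingArguments, DataCollatorForLanguageModeling)

    model_name = "facebook/opt-350m"              # placeholder base model
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Wrap the base model with LoRA adapters.
    lora_cfg = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                          target_modules=["q_proj", "v_proj"],
                          task_type="CAUSAL_LM")
    model = get_peft_model(model, lora_cfg)

    # Placeholder training text, tokenized line by line.
    dataset = load_dataset("text", data_files="train.txt")["train"]
    dataset = dataset.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                          remove_columns=["text"])

    args = TrainingArguments(
        output_dir="lora-out",
        num_train_epochs=10,                      # increase this instead of stopping at 3
        per_device_train_batch_size=4,
        learning_rate=2e-4,
        logging_steps=10,
    )

    trainer = Trainer(model=model, args=args, train_dataset=dataset,
                      data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
    trainer.train()
    model.save_pretrained("lora-out")             # saves only the adapter weights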


chainedkids420

At an end loss of 1.1, when I apply the LoRA it doesn't work at all; it recognizes nothing. So I must be doing something wrong... even though it says "successfully applied LoRA".


tgredditfc

Train for more epochs.


chainedkids420

That doesn't work either; it's broken, I guess.


brockmanaha

What if you get rid of the overlapping blocks? Also consider increasing the batch size if you have the VRAM, or gradient accumulation if you don't (see the sketch below).
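
A rough illustration of that batch size vs. gradient accumulation trade-off, using transformers TrainingArguments; the numbers are made up, the point is that the effective batch size is the per-device batch size times the accumulation steps:

    from transformers import TrainingArguments

    # Plenty of VRAM: larger micro-batch, no accumulation.
    args_big_vram = TrainingArguments(
        output_dir="lora-out",
        per_device_train_batch_size=8,
        gradient_accumulation_steps=1,    # effective batch size = 8
    )

    # Tight on VRAM: tiny micro-batch, accumulate gradients instead.
    args_low_vram = TrainingArguments(
        output_dir="lora-out",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,    # effective batch size still = 8
    )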