How many GPUs are required to fine-tune bge-m3 on 1 million triplets?
Congratulations to the whole BAAI team for the excellent work!
I am currently collecting 1 million triplets (query, list[pos], list[neg]). Now I wonder how many GPUs are required for the fine-tuning?
Any suggestions are welcome, friends.
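For reference, a minimal sketch of serializing triplets in the (query, list[pos], list[neg]) shape described above. This assumes a JSONL layout (one JSON object per line) of the kind commonly used for embedding fine-tuning; the example texts are hypothetical.

```python
import json

# Hypothetical triplets in the (query, list[pos], list[neg]) shape.
triplets = [
    {
        "query": "what is the capital of France?",
        "pos": ["Paris is the capital of France."],
        "neg": [
            "Berlin is the capital of Germany.",
            "The Seine flows through Paris.",
        ],
    },
]

def write_triplets(path, rows):
    """Write one JSON object per line, validating the expected shape."""
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            assert isinstance(row["query"], str)
            assert isinstance(row["pos"], list) and row["pos"]
            assert isinstance(row["neg"], list)
            f.write(json.dumps(row, ensure_ascii=False) + "\n")

write_triplets("train.jsonl", triplets)
```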
Thanks for your interest in our work! I think 8*A100 is enough.
@wilfoderek were you able to fine-tune the model? I fine-tuned it, and it now gives me a 0.9995 similarity score for everything, no matter what the string is. I must have goofed up the training process, I guess.
Still working on collecting data! But from your description, your problem might be related to overfitting.
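One quick sanity check for the "0.9995 similarity for everything" symptom is whether the embeddings have collapsed toward a single direction. Below is a minimal NumPy sketch of that check, using synthetic vectors in place of real model embeddings; with actual embeddings of unrelated texts, a mean pairwise cosine near 1.0 would point at collapse rather than a scoring bug.

```python
import numpy as np

def mean_offdiag_cosine(emb):
    """Mean cosine similarity between all distinct pairs of rows in `emb`."""
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = emb @ emb.T
    n = sim.shape[0]
    # Subtract the diagonal (self-similarity = 1) before averaging.
    return (sim.sum() - n) / (n * (n - 1))

rng = np.random.default_rng(0)
# Synthetic stand-ins: spread-out vs. nearly identical embeddings.
healthy = rng.normal(size=(8, 64))
collapsed = np.ones((8, 64)) + 0.01 * rng.normal(size=(8, 64))

print(mean_offdiag_cosine(healthy))    # low for diverse embeddings
print(mean_offdiag_cosine(collapsed))  # near 1.0 when collapsed
```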