Submitted by Zatania t3_xw5hhl in MachineLearning
So I'm doing a thesis paper using BERT and FAISS. Google Colab [haven't tried Pro yet] works fine with datasets smaller than 100 MB on the GPU runtime, but when the dataset is bigger than that, Colab just crashes.
Will Colab Pro help with this, or is there another alternative?
Edit: the dataset file sizes I tried that crashed Colab were somewhere around 1 GB to 1.5 GB.
supreethrao t1_ir4ojh3 wrote
You might want to check your data processing pipeline and maybe optimise how you're allocating GPU RAM / system RAM. Colab Pro will help, but I'd suggest that you try to optimise the way you deal with your data, as the Colab free tier should easily handle datasets in the few-GB range.
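
For example, a common reason Colab dies on a ~1 GB corpus is materialising the whole dataset and all its embeddings in memory at once. Below is a minimal sketch of a streamed alternative: encode the corpus in small batches with BERT and add each batch to a FAISS index incrementally, so only one batch lives in RAM at a time. The file name `corpus.txt`, the `bert-base-uncased` checkpoint, the batch size, and the mean-pooling choice are illustrative assumptions, not anything from the original post.

```python
import faiss
import torch
from transformers import AutoTokenizer, AutoModel

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").to(device).eval()

index = faiss.IndexFlatIP(768)  # BERT-base hidden size

def embed(texts):
    # Tokenize and mean-pool the last hidden state over non-padding tokens.
    enc = tokenizer(texts, padding=True, truncation=True,
                    max_length=128, return_tensors="pt").to(device)
    with torch.no_grad():
        out = model(**enc).last_hidden_state
    mask = enc["attention_mask"].unsqueeze(-1)
    vecs = (out * mask).sum(1) / mask.sum(1)
    return vecs.cpu().numpy().astype("float32")

batch, batch_size = [], 64
with open("corpus.txt") as f:  # one document per line (assumed format)
    for line in f:
        batch.append(line.strip())
        if len(batch) == batch_size:
            index.add(embed(batch))  # only this batch is held in RAM
            batch = []
if batch:
    index.add(embed(batch))

faiss.write_index(index, "corpus.faiss")
print(index.ntotal, "vectors indexed")
```

If even the vectors themselves don't fit in RAM, you can go further and write them to disk (e.g. a memory-mapped numpy array) or switch to a compressed FAISS index type, but for a 1-1.5 GB text corpus the batched approach above is usually enough on the free tier.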