Submitted by Zealousideal-Copy463 t3_10khmxo in deeplearning
Zealousideal-Copy463 OP t1_j5tk24z wrote
Reply to comment by FuB4R32 in Best cloud to train models with 100-200 GB of data? by Zealousideal-Copy463
Ohh, I didn't know that about GCP. So you can point a VM at a bucket and it just "reads" the data? You don't have to "upload" the data into the VM?
As I said in a previous comment, my problem with AWS (S3 and SageMaker) is that the data lives on a different network, and even though it's still an AWS network, you have to move the data around, and that takes a while when it's 200 GB.
FuB4R32 t1_j5v9mlr wrote
Yeah, as long as your VM is in the same region as the bucket it should be fine. Even if you have 200 GB, it doesn't take that long to move between regions either.
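For anyone finding this later: the usual way to "point a VM at a bucket" on GCP is Cloud Storage FUSE (`gcsfuse`), which mounts a bucket as a local directory so training code reads objects on demand instead of copying 200 GB up front. A minimal sketch (the bucket name `my-training-data` and mount point are placeholders, not from this thread):

```shell
# Install Cloud Storage FUSE on a Debian/Ubuntu GCP VM
# (package name and repo per Google's gcsfuse install docs)
export GCSFUSE_REPO="gcsfuse-$(lsb_release -c -s)"
echo "deb https://packages.cloud.google.com/apt $GCSFUSE_REPO main" \
  | sudo tee /etc/apt/sources.list.d/gcsfuse.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg \
  | sudo apt-key add -
sudo apt-get update && sudo apt-get install -y gcsfuse

# Mount the bucket (hypothetical name) as a local directory.
# The VM's service account needs Storage read access to the bucket.
mkdir -p "$HOME/data"
gcsfuse my-training-data "$HOME/data"

# Training code can now read files as if they were local, e.g.:
ls "$HOME/data"

# Unmount when done
fusermount -u "$HOME/data"
```

Reads go over the network per object, so keeping the VM and bucket in the same region (as noted above) matters a lot for throughput; for many-small-files datasets it can still be faster to copy once with `gsutil -m cp -r gs://my-training-data .` to local SSD.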
Zealousideal-Copy463 OP t1_j5ylls6 wrote
Thanks a lot, gonna try it!