
peno8 OP t1_j1y5vrf wrote

Hey, thanks for the reply.

I know using a MacBook for DL is kind of unusual, so for DL itself I will use Google Colab or buy a desktop. I will only use my laptop for feature calculation, so the batch size will not be a problem for me.
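A rough sketch of that split, assuming a PyTorch/torchvision pipeline where "feature calculation" means running data through a pretrained backbone locally and uploading the result to Colab for training (the model, dataset, and file names below are illustrative, not the OP's actual setup):

```python
import numpy as np
import torch
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import FakeData  # stand-in for a real dataset

# Pretrained backbone used purely as a feature extractor; no training happens locally.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([transforms.Resize(224), transforms.ToTensor()])
loader = DataLoader(FakeData(size=64, transform=preprocess), batch_size=8)  # small batches are fine on a laptop

features = []
with torch.no_grad():
    for images, _ in loader:
        features.append(backbone(images).numpy())

# Save the features locally, then upload "features.npy" to Colab for training.
np.save("features.npy", np.concatenate(features))
```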

3

barvazduck t1_j1yl9s5 wrote

I work on LLMs and this is my setup (I have an M1 with 32 GB, though it has little influence since everything is done in Colab/server jobs).

The local device can have more influence if you plan to use the laptop for testing/inference with a Colab-trained model.
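For example, a minimal sketch of that kind of local inference, assuming PyTorch with the MPS backend on Apple Silicon and that the trained weights were downloaded from Colab (the architecture and file name are illustrative):

```python
import torch
from torchvision import models

# Use Apple's Metal backend (MPS) on an M1/M2 Mac if available, otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = models.resnet18()  # same architecture as was trained on Colab
model.load_state_dict(torch.load("model_weights.pt", map_location=device))
model.to(device).eval()

with torch.no_grad():
    dummy_input = torch.randn(1, 3, 224, 224, device=device)
    prediction = model(dummy_input).argmax(dim=1)
    print(prediction.item())
```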

1