Submitted by sifarsafar t3_zyhy0s in deeplearning

I am an MS Data Analytics student. During a recent assignment (image classification on the MNIST dataset using a CNN), my current laptop (specs: i5 10th gen, 8GB RAM, 512GB SSD, Windows) struggled and took a long time to train models. I am thinking of buying a new laptop for the upcoming semester, which will include modules like Scalable Machine Learning and Modelling and Simulation of Natural Systems, plus my dissertation. I have narrowed it down to two choices:

  1. MacBook M1 Air with 16GB RAM and 512GB SSD
  2. MacBook M2 Pro with 16GB RAM and 256GB SSD

I have mainly chosen Apple devices because my other devices (phone and iPad) are from Apple too, so I would benefit from the Apple ecosystem. Also, the Macs are lighter and have longer battery life, which makes them very portable.

Still, if you think I should go with a Windows laptop, please suggest that too.
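
For context, the workload in question is roughly the sketch below: a small PyTorch CNN trained on MNIST. The layer sizes and hyperparameters are illustrative only, not my actual assignment code.

```python
# Minimal MNIST CNN sketch in PyTorch (illustrative only, not the assignment code).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Two conv blocks: 28x28 -> 14x14 -> 7x7 feature maps.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 7 * 7, 10)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def train_one_epoch(device="cpu"):
    data = datasets.MNIST("data", train=True, download=True,
                          transform=transforms.ToTensor())
    loader = DataLoader(data, batch_size=64, shuffle=True)
    model = SmallCNN().to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()

if __name__ == "__main__":
    train_one_epoch()
```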

0

Comments


suflaj t1_j25xvso wrote

Unless you have money for an actual workstation like the top-end Dell Precisions, ThinkPads, or Razer Blades, you should probably not get a laptop to do deep learning on. Those Macs will not be much faster than running everything on the CPU, even if you do get the Metal API working on them.
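
If you do try the Metal route, it boils down to PyTorch's MPS backend, roughly like this (a minimal sketch, assuming a recent PyTorch build with MPS support):

```python
import torch

# Use Apple's Metal (MPS) backend if this PyTorch build exposes it; otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
print(f"Running on: {device}")

# Toy matrix multiply just to confirm the device actually executes work.
x = torch.randn(4096, 4096, device=device)
y = x @ x
print(y.device)
```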

8

bacocololo t1_j25zlms wrote

Wrong. I have a MacBook Pro 14 M1 and it's faster than my 3080.

−5

cma_4204 t1_j264tc8 wrote

I have an MSI GE66 Raider my work got me: 32GB RAM, RTX 3070 8GB, 1TB SSD. I use it for PyTorch, and when I max it out I just use RunPod or Lambda Labs to rent an RTX 3090 or A100 for dirt cheap. Would recommend it.

4

elbiot t1_j26egmm wrote

Keep the laptop, buy a used gaming desktop with a 3060 (12GB VRAM), and SSH into it from your laptop.

11

chengstark t1_j26evbj wrote

Don't bother; ask your department for cluster resources. None of these will be enough for actual fast DL work.

6

Final-Rush759 t1_j2783cm wrote

RTX 3070, or wait for the 4070. Then get a laptop with a good cooling system, usually a gaming laptop with plenty of vents on the bottom and two fans inside that push air out the front or sides of the laptop. Then put a Linux OS drive in the second SSD slot and have 32GB or more of RAM.

2

DavIantt t1_j2785s5 wrote

Can you get at the RAM? If so, you can upgrade it relatively easily (and cheaply). Most laptops really need 16 to 32 GB anyway to run efficiently.

1

_-K1L4-_ t1_j28ogmu wrote

Why not just use Google Colab?

1

Natalia_Moon t1_j29rxfy wrote

You should use Google Colab or pay for Amazon's services.

1

ShadowStormDrift t1_j29wqlt wrote

I have a Mac M1 Pro. Given to me by my work.

DO NOT. I REPEAT. DO NOT USE A MAC TO DO DEEP LEARNING.

You will not have a good time.

Their decision to go with their own architecture (one chip acting as both CPU and GPU) has completely gimped them in this space.

Most popular DL frameworks ship with CUDA. CUDA is controlled by Nvidia, and native M1 chips are not compatible with CUDA.

This means by doing DL on a Mac you are locking yourself out of the entire DL ecosystem.
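
Concretely, the device-selection idiom that most tutorials and open-source repos use assumes CUDA, so on an M1 it silently drops you to CPU (a rough sketch, assuming PyTorch):

```python
import torch

# The CUDA-first idiom used by most tutorials and open-source repos.
# On an M1 Mac, torch.cuda.is_available() is always False, so everything
# quietly runs on the CPU unless the code was written with MPS in mind.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)  # -> "cpu" on Apple Silicon
```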

Additionally, they (Apple) are highly restrictive about what they do and do not allow in their ecosystem, leading to a VERY restrictive development environment. Seriously, getting something like OpenRefine working on a Mac was not possible due to their stance of "only authorized programs may be installed here". At the time of my attempt, OpenRefine, a highly popular tool for inspecting massive CSV files, was not authorized on the new Mac M1 series.

Sure, they may eventually deign to authorize something as popular as OpenRefine... but frankly you will be better off getting actual work done instead of waiting for a company to realize that nobody is big enough to police the entirety of the internet.

2

ralphc t1_j2a17xo wrote

What's your budget?

I have a Dell Alienware 15" laptop with an RTX 3080 Ti (16GB of GPU memory) that does well on deep learning, TensorFlow, etc.

With Windows 11 on it you can set up WSL 2 and run graphical Linux programs. CUDA has a WSL-specific setup for reaching the GPU, and the rest is straightforward.
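
Once the WSL-side CUDA setup is done, a quick sanity check from inside the WSL distro looks something like this (a minimal sketch, assuming PyTorch is installed there):

```python
import torch

# Quick sanity check that the WSL 2 CUDA passthrough is working.
if torch.cuda.is_available():
    print("CUDA OK:", torch.cuda.get_device_name(0))
else:
    print("CUDA not visible; re-check the NVIDIA WSL driver and CUDA toolkit setup.")
```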

It looks like you can get one in the $2,500-3,000 range; that's why I asked about your budget.

1

gelvis101 t1_j2asbrb wrote

Paperspace is good for ML too; just SSH into it from your laptop.

1