
ZachVorhies t1_j1yagxw wrote

Macs are not good for machine learning. What's good for machine learning is an NVIDIA graphics card with 8 GB of VRAM or more. So that's going to be Windows or Linux only.

With a Mac you'll only be able to run in CPU mode. And right now many models don't support macOS at all, and in some cases they require specific packages compiled for the M1. It's kind of a nightmare.

Edit: Downvote me all you want, haters. I use ML APIs to make apps that run on Linux, macOS, and Windows. Macs only have CPU inference, not CUDA acceleration, and are therefore ~10x slower. Your downvotes are giant fanboy cope.

https://github.com/zackees/transcribe-anything

−3

peno8 OP t1_j1yc0lr wrote

Hm, I've seen many people doing ML on a MacBook, and they say it's doable...

Are you talking about like this? https://scikit-learn.org/stable/install.html#installing-on-apple-silicon-m1-hardware

I know PyTorch is not available for Apple Silicon at the moment, and that's OK with me because, if I'm not wrong, PyTorch is more about DL, and I'll do that from Colab or a dedicated desktop.

It would be great if you could show me an example of what other kinds of models don't support Mac; then I'll check whether it's a deal breaker for me.

2

ZachVorhies t1_j1zxoyr wrote

PyTorch does have preview builds that run on the Mac M1. But it can't ever offer CUDA acceleration, because these Macs don't have NVIDIA graphics cards.
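In practice, PyTorch scripts are usually written with a device-fallback chain so the same code runs on an NVIDIA box or a Mac. A minimal sketch of that logic (the helper name `pick_device` and the boolean-flag signature are my own, for illustration; real code would call `torch.cuda.is_available()` and `torch.backends.mps.is_available()` directly):

```python
# Sketch of a typical PyTorch device-fallback chain. The helper name and
# boolean-flag signature are illustrative; in real code you would pass
# torch.cuda.is_available() and torch.backends.mps.is_available().

def pick_device(cuda_ok: bool, mps_ok: bool) -> str:
    """Return the fastest available torch device string."""
    if cuda_ok:
        return "cuda"  # NVIDIA GPU: full CUDA acceleration
    if mps_ok:
        return "mps"   # Apple-silicon GPU backend in PyTorch preview builds
    return "cpu"       # fallback: works everywhere, much slower for big models

# Linux box with an NVIDIA card:
print(pick_device(True, False))   # cuda
# M1 Mac with a recent PyTorch preview build:
print(pick_device(False, True))   # mps
# Anywhere else:
print(pick_device(False, False))  # cpu
```

The string returned here is what you'd hand to `torch.device(...)` before moving a model or tensors.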

1

twohusknight t1_j1yg7md wrote

I regularly do ML, professionally and as a hobby, on a 2015 MacBook Pro, where I run deep-learning computer-vision inference on the CPU. It's plenty fast for PGMs, SVMs, decision trees, etc. Not every problem requires a GPU, but if I'm doing anything at scale, or anything that needs power and speed, I'll just set up a cloud instance.
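To make the point concrete: the classical models listed above really are CPU-friendly. Here's a toy pure-stdlib nearest-centroid classifier (a hypothetical stand-in for that family of models, not anyone's actual setup) that handles a small dataset instantly with no GPU in sight:

```python
# Toy nearest-centroid classifier in pure stdlib Python. Illustrative only:
# a stand-in for the kind of classical model (SVMs, trees, etc.) that runs
# comfortably on a laptop CPU.
from statistics import mean
from math import dist

def fit_centroids(X, y):
    """Compute one centroid (per-feature mean) for each class label."""
    labels = sorted(set(y))
    return {
        label: tuple(mean(x[i] for x, lab in zip(X, y) if lab == label)
                     for i in range(len(X[0])))
        for label in labels
    }

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda label: dist(centroids[label], x))

# Two well-separated 2-D clusters:
X = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
y = ["a", "a", "b", "b"]
cents = fit_centroids(X, y)
print(predict(cents, (0.2, 0.1)))  # a
print(predict(cents, (5.1, 5.1)))  # b
```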

I also have an M1 Mac mini sitting on my desk that I remote into, which I've used for GPU-based DL training and inference. It's great for small experiments and writing code, but cloud instances are the way to go until the US silicon fabs start opening and prices drop.

1

ZachVorhies t1_j1zx7ci wrote

If your target is small ML models, they'll run on anything. If you want to do serious stuff like Whisper, you'll need an NVIDIA graphics card or you'll be running ~10x slower.

0