ZachVorhies t1_j1zx7ci wrote
Reply to comment by twohusknight in [D] Is 16gb ram for macbook pro enough for ML?? by peno8
If your target is small ML models, then they will run on anything. If you want to do serious stuff like Whisper, you'll need an NVIDIA graphics card or you'll be running 10x slower.
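For example, a rough sketch of what that looks like in practice, assuming the `openai-whisper` package is installed (the audio file name here is just a placeholder):

```python
import torch
import whisper

# Use the NVIDIA GPU via CUDA when available; otherwise fall back to CPU,
# which is where the big slowdown comes from.
device = "cuda" if torch.cuda.is_available() else "cpu"

# "base" is one of the smaller Whisper checkpoints; the larger ones hurt even more on CPU.
model = whisper.load_model("base", device=device)

result = model.transcribe("example_audio.mp3")  # placeholder path
print(result["text"])
```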
ZachVorhies t1_j1yagxw wrote
Reply to [D] Is 16gb ram for macbook pro enough for ML?? by peno8
Macs are not good for machine learning. What's good for machine learning is an NVIDIA graphics card with 8 GB of VRAM or more. So that's going to be Windows or Linux only.
With a Mac you'll only be able to run in CPU mode. And right now many models don't support macOS at all, and in some cases they require specific packages compiled for the M1. It's kind of a nightmare.
Edit: Downvote me all you want, haters. I use ML APIs to make apps that run on Linux, Mac, and Windows. Macs only have CPU inference, not CUDA acceleration, and are therefore 10x slower. Your downvotes are giant fanboy cope.
ZachVorhies t1_j1zxoyr wrote
Reply to comment by peno8 in [D] Is 16gb ram for macbook pro enough for ML?? by peno8
PyTorch does have native Apple Silicon builds, including an MPS backend that runs on the M1's GPU through Metal. But they can't ever offer CUDA acceleration, because these Macs don't have NVIDIA graphics cards.
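Rough sketch of how backend selection looks, assuming a recent PyTorch build (1.12 or newer) where the MPS backend exists:

```python
import torch

# Pick the fastest available backend: CUDA on an NVIDIA machine,
# MPS (Apple's Metal backend) on an M1/M2 Mac, otherwise plain CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

x = torch.randn(1024, 1024, device=device)
y = x @ x  # runs on whichever backend was selected
print(device, y.shape)
```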