Submitted by MyActualUserName99 t3_xt80zc in deeplearning
I've been looking into getting a new laptop for personal use as well as for deep/machine learning. I ran across the Lambda Tensorbook (https://lambdalabs.com/deep-learning/laptops/tensorbook/customize), a laptop specifically designed for deep learning, with the following specs:
Lambda Tensorbook
CPU: 14-core Intel Core i7
GPU: NVIDIA RTX 3080 Ti w/ 16 GB VRAM
RAM: 64 GB 4800 MHz
Storage: 2 TB NVMe Gen4 SSD
Display: 15.6" 1440p 240 Hz
System: Ubuntu 22.04 + Windows 10 Pro with TensorFlow, CUDA, PyTorch, etc. pre-installed
Price: $5,000
At first glance it looks really good, but the price is a bit high. I started looking at gaming laptops and ran across an Alienware x15 R2 (https://www.dell.com/en-us/shop/dell-laptops/alienware-x15-r2-gaming-laptop/spd/alienware-x15-r2-laptop/wnr2x15cto10ssb) with the following specs:
Alienware x15 R2
CPU: 14-core Intel Core i9
GPU: NVIDIA RTX 3080 Ti w/ 16 GB VRAM
RAM: 32 GB 5200 MHz
Storage: 2 TB NVMe Gen4 SSD
Display: 15.6" 1080p 165 Hz
System: Windows 10 Pro
Price: $3,800
Comparing the two, the Lambda has a better display (which I don't care much about) and 32 GB more RAM, but it costs $1,200 more, while the Alienware has the Intel Core i9 CPU. The pre-installed software is nice, but I can always install TensorFlow, PyTorch, and CUDA on the Alienware myself. I don't think I'll need the extra 32 GB of RAM, since I can always use cloud services for very large projects, so it seems that the Alienware best fits my needs.
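For what it's worth, checking a manual install only takes a minute. A quick sanity script like the one below (assuming PyTorch and TensorFlow are already pip-installed; it only uses their standard device-query calls) confirms both frameworks can see the GPU:

```python
# Quick check that a manual TensorFlow/PyTorch/CUDA install sees the GPU.
import torch
import tensorflow as tf

# PyTorch side: is CUDA usable, and which GPU / how much VRAM?
print("PyTorch CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("GPU:", props.name, "| VRAM (GB):", round(props.total_memory / 1024**3, 1))

# TensorFlow side: list the GPUs it can see.
print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))
```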
Are there any downsides to using a gaming laptop for deep learning rather than a laptop specifically designed for it? Is 64 GB of RAM really needed for most moderate deep/machine learning projects, or should 32 GB with a reduced batch size work fine?
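On the batch-size point: the usual workaround on a memory-limited machine is to shrink the per-step batch and accumulate gradients over several steps, so the effective batch size stays the same. A minimal PyTorch sketch of that idea (the model, random data, and accumulation factor below are just placeholders):

```python
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder model and random data; any model/DataLoader would slot in here.
model = nn.Linear(128, 10).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(1024, 128),
                                   torch.randint(0, 10, (1024,))),
    batch_size=32,   # small per-step batch to fit in limited memory
)
accum_steps = 4      # effective batch size = 32 * 4 = 128

optimizer.zero_grad()
for step, (x, y) in enumerate(loader):
    loss = loss_fn(model(x.to(device)), y.to(device)) / accum_steps
    loss.backward()                   # gradients accumulate across mini-batches
    if (step + 1) % accum_steps == 0:
        optimizer.step()              # update once per accum_steps mini-batches
        optimizer.zero_grad()
```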
cma_4204 t1_iqogqbg wrote
I have an MSI laptop with an RTX 3070 and 32 GB of RAM. It's plenty for experimenting and training small models/datasets. Once I have something working and need more power, I rent GPUs by the hour from Lambda Labs or RunPod. For $1-2/hr I'm able to get around 80 GB of GPU memory. Long story short, I would go for the cheaper one and use the cloud when needed.
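To make that local-then-cloud workflow painless, it helps to keep the training script device-agnostic, so the exact same code runs on the laptop GPU or on a rented cloud instance. A rough sketch (the tiny model and dummy batch are stand-ins for a real setup):

```python
import torch
from torch import nn

# Pick whatever accelerator is present: the laptop GPU locally,
# or a bigger rented GPU in the cloud, with no code changes.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Training on:", device)

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Dummy batch; a real DataLoader would go here.
x = torch.randn(64, 784, device=device)
y = torch.randint(0, 10, (64,), device=device)

loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```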