Submitted by moekou t3_z4lpry in deeplearning
notgettingfined t1_ixrnwuu wrote
I dual boot. I'm sure you could get things to work on Windows but it would be a horrible experience.
jazzzzzzzzzzzzzzzy t1_ixs06fo wrote
I always found this weird. Getting Nvidia drivers to work on Linux is a big pain, yet most ML people use it. I use Windows, and WSL if I have to. I never go into my Linux dual boot.
VinnyVeritas t1_ixs2kfa wrote
Maybe it was a pain years ago, I don't really know, but nowadays you just click "install NVIDIA drivers" in the software manager and it works. There's nothing painful or difficult about it.
Rephil1 t1_ixsa6jj wrote
Standard driver + CUDA. Get yourself a Docker container and you're good to go. This is probably the easiest way to do it.
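A minimal sanity check for that setup, assuming a PyTorch build with CUDA support inside a container started with GPU access (e.g. `docker run --gpus all ...`); the container image and exact flags are up to you:

```python
# Quick check that the driver and CUDA are visible inside the container.
# Assumes PyTorch was installed with CUDA support.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print(f"CUDA {torch.version.cuda}, device: {torch.cuda.get_device_name(0)}")
    # Run a tiny matmul on the GPU to confirm kernels actually launch.
    x = torch.randn(1024, 1024, device=device)
    y = x @ x
    torch.cuda.synchronize()
    print("GPU matmul OK:", y.shape)
else:
    print("No CUDA device visible - check the driver and the --gpus flag.")
```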
jazzzzzzzzzzzzzzzy t1_ixtoaub wrote
Okay cool. I will try this out.
Comfortable-Author t1_ixsdaz5 wrote
Nvidia drivers for "simply" compute workloads are really good, just look at all the servers with Nvidia GPUs. The problem is when you need to output an image to a screen and deal with Wayland or X11.
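To illustrate the compute-only case: the driver can be queried and used entirely headlessly, with no Wayland or X11 involved. A small sketch, assuming the nvidia-ml-py package (`pip install nvidia-ml-py`) and a working NVIDIA driver:

```python
# Query the driver and GPUs directly through NVML - no display server needed.
import pynvml

pynvml.nvmlInit()
print("Driver version:", pynvml.nvmlSystemGetDriverVersion())
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older bindings return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {name}, {mem.total // 2**20} MiB total")
pynvml.nvmlShutdown()
```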
Longjumping-Wave-123 t1_ixvf7jl wrote
Lambda Stack ftw