Submitted by samobon t3_1040w4q in MachineLearning
geeky_username t1_j331xz0 wrote
Reply to comment by ZaZaMood in [News] AMD Instinct MI300 APU for AI and HPC announced by samobon
"Those that fail to learn from history are doomed to repeat it."
Especially on the software side, AMD has a habit of releasing something and then not doing much for continued support, expecting the community to shoulder the labor.
HippoLover85 t1_j35en7z wrote
Previously AMD didn't have the budget for it. They do now, and have really only had it for the last two-ish years.
Will they put resources toward it now? I hope so. But it also looks like AMD is trying to get products into mega datacenter/supercomputer deployments and spread adoption that way.
zeyus t1_j338cuu wrote
Isn't continued support one of the selling points for AM5? They supported the previous gen for ages, and they plan to do it again.
geeky_username t1_j33cic6 wrote
Software.
Having AI compute hardware is rather pointless without the supporting software.
Nvidia has an entire CUDA ecosystem for developers to use.
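To make that concrete, here's roughly what writing against that ecosystem looks like. This is just an illustrative vector-add kernel using the standard CUDA runtime API, nothing from AMD's side:

```cpp
// Minimal CUDA C++ sketch: vector addition on the GPU.
// Illustrative only; trivial use of the CUDA runtime API.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];  // one thread per element
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the sketch short; host and device share pointers.
    cudaMallocManaged((void**)&a, bytes);
    cudaMallocManaged((void**)&b, bytes);
    cudaMallocManaged((void**)&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Everything in there (nvcc, the runtime, the profilers, the libraries you'd pull in next) is Nvidia's stack. That's the lock-in people mean when they say "ecosystem."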
zeyus t1_j33nspg wrote
Absolutely agree. It's been a while since I've had AMD hardware, but I'd consider it again (especially CPU). I just haven't been aware of specific software issues either; I mean, Intel, AMD, and Nvidia have all had bugfixes and patching for drivers and firmware. Is there something I've missed about AMD and software?
BTW, I haven't had enough disposable income to upgrade, so I've been stuck on a 4590K for about six years, and I hate my motherboard software (that's Asus bloatware). I had so much trouble getting the NVMe and RAID to work, but once I did it's been OK. The 1070 I have is getting a bit too small for working with ML/AI, but what can you do... it still runs most newish games too.
geeky_username t1_j358k76 wrote
>Is there something I've missed about AMD and software?
They have this https://gpuopen.com/
Which seems great in theory, but some of it hasn't been touched in a long time.
Radeon Rays: last updated May 2021.
They'll release something, do a bunch of initial work on it, and then it fades.
zeyus t1_j363mxu wrote
Well, that is a genuine shame; Nvidia really needs some competition in this space. I'm sure plenty of researchers and enthusiasts would happily use different hardware, as long as porting was easy. I've written some CUDA C++ and it's not bad. Manufacturer-specific code always feels a bit gross, but the GPU agent-based modeling framework I was using was strictly CUDA.
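For what it's worth, AMD's answer to "as long as porting was easy" is HIP, which mirrors the CUDA runtime API nearly call-for-call. A minimal sketch, assuming the standard HIP runtime headers; the same vector add you'd write in CUDA, mostly just renamed:

```cpp
// Roughly the same vector-add kernel in AMD's HIP, their CUDA-like runtime.
// Hedged sketch: the point is that the port is mostly a mechanical rename.
#include <hip/hip_runtime.h>
#include <cstdio>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];  // one thread per element
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    hipMallocManaged((void**)&a, bytes);  // cudaMallocManaged -> hipMallocManaged
    hipMallocManaged((void**)&b, bytes);
    hipMallocManaged((void**)&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    vecAdd<<<(n + threads - 1) / threads, threads>>>(a, b, c, n);
    hipDeviceSynchronize();  // cudaDeviceSynchronize -> hipDeviceSynchronize

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    hipFree(a); hipFree(b); hipFree(c);
    return 0;
}
```

Of course, the whole complaint in this thread is that the API existing isn't the same as the API being supported long-term.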
ZaZaMood t1_j39c7ot wrote
Nvidia needs some competition, fr fr. I can't even consider buying AMD because the entire data science community is pinned to CUDA.