dhruvdh t1_ispgx2z wrote
Reply to comment by Blasket_Basket in rx6900xt for ML? [D] by Moppmopp
Please don't spread misinformation by stating your opinion as fact.
dhruvdh t1_ispgllz wrote
Reply to comment by Moppmopp in rx6900xt for ML? [D] by Moppmopp
It is potentially enough. But most material on the internet assumes you have a CUDA device, so as a novice it would make sense to take the path of least resistance.
If you do not have another option, look into https://www.amd.com/en/graphics/servers-solutions-rocm-ml. It won't explicitly say your card is supported, but it should run fine.
ROCm ML is supported only on Linux, as far as I know.
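For what it's worth, once a ROCm build of PyTorch is installed (an assumption; the page above covers installs), you can check whether the GPU is visible the same way you would with CUDA, since ROCm builds of PyTorch reuse the `torch.cuda` namespace. A minimal sketch:

```python
# Sketch: check whether PyTorch can see a GPU.
# Assumes a ROCm (or CUDA) build of PyTorch; ROCm builds expose the
# device through the same torch.cuda API as CUDA builds do.
try:
    import torch
    gpu_visible = torch.cuda.is_available()  # True if a usable GPU is found
    print("GPU visible:", gpu_visible)
except ImportError:
    gpu_visible = None  # PyTorch not installed at all
    print("PyTorch is not installed")
```

If this prints False on a ROCm setup, it usually means the driver/ROCm stack isn't seeing the card, not that PyTorch itself is broken.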
dhruvdh t1_is3wryt wrote
Reply to comment by Batuhan_Y in [Project] I've built an Auto Subtitled Video Generator using Streamlit and OpenAI Whisper, hosted on HuggingFace spaces. by Batuhan_Y
I am not familiar with huggingface spaces, but do you intend to share source on github or colab?
dhruvdh t1_ispjncc wrote
Reply to comment by Moppmopp in rx6900xt for ML? [D] by Moppmopp
Have you considered not buying a GPU at all and making use of paid services like Google Colab, Lambda Cloud, etc.?
You can use these while you learn, get a better sense of your requirements, and make a more educated decision later.
Colab's free tier works great for short experiments, and the next tier up is just $10 a month.
AMD is also set to announce new GPUs on November 3; depending on their pricing, last-gen prices should all go down.