Submitted by Kaudinya t3_y25fjb in MachineLearning
Running ML workflows involves several hurdles. You connect to a machine over SSH, install the CUDA driver, fetch your code, copy the data, build a Docker image, run the script, watch the process, etc. Finally, if the machine is a cloud instance, you stop it. The alternative is to use end-to-end platforms, either open-source or enterprise ones.
In an attempt to simplify this, we open-sourced a tool that lets you run ML workflows from the CLI while they actually execute in the cloud; it takes care of provisioning the infrastructure, setting up the environment, etc. We would be glad to get your feedback on the project: github.com/dstackai/dstack. See the link in the comment. Many thanks!
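Roughly, the idea is that you describe a workflow in a YAML file in your repo and launch it with the CLI, which provisions a cloud machine, runs the commands, and saves the declared artifacts. Below is a simplified sketch; the file name and key names are illustrative only, so please check the repo for the exact syntax.

```yaml
# Simplified sketch of a workflow definition, e.g. .dstack/workflows.yaml
# (key names here are illustrative; see the repo for the actual schema).
workflows:
  - name: train
    provider: bash            # run plain shell commands
    commands:
      - pip install -r requirements.txt
      - python train.py
    artifacts:
      - checkpoints           # saved when the run finishes
    resources:
      gpu: 1                  # request a cloud machine with one GPU
```

You would then run something like `dstack run train` from your terminal: the CLI provisions the instance, sets up the environment, runs the commands, stores the artifacts, and handles stopping the machine afterwards, so there is no SSH-ing or manual instance management.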
Kaudinya OP t1_is0woi5 wrote
Here is the link to a post where you can see how it works:
https://mlopsfluff.dstack.ai/p/simplifying-the-mlops-stack