Submitted by C_l3b t3_10wrlrm in MachineLearning
Hi, I want to open a thread about RL (non-deep and deep)
What are the papers/books that are "must read" to have strong foundation?
This is an incredible resource, thank you!
This really good, thanks!
Is this also kind of a course?
Well... kind of. For courses I would recommend Silver's course, followed by Levine's course, both available on YouTube (besides reading the Sutton-Barto book). But Spinning Up is not just a reading list: it also provides detailed explanations of the most important model-free algorithms, along with code implementations that are meant to be as easy to understand as possible. If you want performant code for research/personal projects, I would not recommend Spinning Up, but it is a great way to learn how the algorithms are implemented.
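To give a flavor of the kind of model-free algorithm these resources cover, here is a minimal tabular Q-learning sketch in the Spinning Up spirit of readability. The chain environment and all hyperparameters are my own illustrative choices, not from any of the courses:

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a toy chain: start at state 0, action 0 moves
    left, action 1 moves right; reaching the last state yields reward 1
    and ends the episode."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]  # q[state][action]
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2 = max(s - 1, 0) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Q-learning update: bootstrap from the greedy value of the next state
            # (q[terminal] is never updated, so its max stays 0, as it should)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning_chain()
print([[round(x, 2) for x in row] for row in q])
```

After training, the greedy policy should prefer moving right in every non-terminal state, since the optimal value of "right" is gamma raised to the distance from the goal. Sutton and Barto (chapter 6) and Silver's lectures 4-5 cover exactly this family of methods.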
Thank you so much for your reply.
I did take a look at your suggestions. The Sergey Levine course seems really awesome; it is definitely on my to-do list. But I'm a little conflicted about the David Silver course. I have a few follow-up questions, if you don't mind:
I'm currently doing the Hugging Face Deep RL course. It is free, and at the end I get a certificate of completion; not that I care much about the certificate, but it is always nice to get one.
Also, isn't the David Silver course a little outdated? It seems the video lectures were recorded seven years ago. I guess the basics don't change, but I was wondering which is the best course to take. I think they are around the same level of difficulty. What would you choose?
Can't really speak for Hugging Face. It seems to touch on relatively advanced topics and challenging tasks. It certainly looks nice from a practitioner's side, which is very useful for learning the various tricks needed to make RL work.
Regarding Silver's course, it is a bit outdated indeed, but the focus is more on the basics of RL, whereas Levine focuses on deep RL and assumes a good understanding of the basics.
Now, some topics in Silver's course are a bit dated (e.g. TD(lambda) with eligibility traces, or linear function approximation) and would be better replaced by topics from more modern courses, typically DQN or AlphaGo (UCL also has a more recent series that touches on deep RL). But Silver's explanations are very instructive, and it is one of the best-taught university courses I have seen (in general). I would at the very least watch the first few lectures.
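For reference, the "dated" topic mentioned above is still a nice exercise to implement. Here is a sketch of TD(lambda) with accumulating eligibility traces, evaluating the random policy on the classic 5-state random walk from Sutton and Barto; the hyperparameters are my own illustrative choices:

```python
import random

def td_lambda_random_walk(n_states=5, episodes=1000, alpha=0.1, lam=0.8, gamma=1.0, seed=0):
    """TD(lambda) policy evaluation with accumulating eligibility traces.
    Random walk over states 0..4, starting in the middle; stepping off the
    left end gives reward 0, off the right end gives reward 1."""
    rng = random.Random(seed)
    v = [0.0] * n_states  # state-value estimates
    for _ in range(episodes):
        e = [0.0] * n_states  # eligibility traces, reset each episode
        s = n_states // 2
        done = False
        while not done:
            s2 = s + (1 if rng.random() < 0.5 else -1)
            if s2 < 0:
                r, v2, done = 0.0, 0.0, True
            elif s2 >= n_states:
                r, v2, done = 1.0, 0.0, True
            else:
                r, v2 = 0.0, v[s2]
            delta = r + gamma * v2 - v[s]  # TD error
            e[s] += 1.0  # accumulating trace for the visited state
            for i in range(n_states):
                v[i] += alpha * delta * e[i]  # credit all recently visited states
                e[i] *= gamma * lam  # decay traces
            s = s2
    return v

v = td_lambda_random_walk()
print([round(x, 2) for x in v])
```

The true values for this walk are 1/6, 2/6, ..., 5/6, so the estimates should come out increasing from left to right. This is the example Sutton and Barto use in their eligibility-traces chapter.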
Thank you so much for all your answers.
I'm so sorry to bother you again. Just one final question.
Do you know if the Spinning Up algorithms are worthwhile? Since I'm on Windows, it seems a little more challenging to install on my local machine. Is there an alternative to a local install, like Colab?
Not really. I think the main strength of the library is that it is designed to make it easy to understand how the algorithms are implemented. At the time, the main alternative was OpenAI/Stable Baselines, which was quite obscure about how the algorithms were implemented. On the other hand, Spinning Up's algorithms do not use some of the more advanced tricks that enhance performance.
However, there are better libraries now. In the same spirit there is CleanRL, which is clean (each algorithm in a single file) but also performant. If you are looking for a modular, easy-to-use library, I would recommend Stable Baselines3.
Thanks!
You can follow along CMU’s deep RL course, they post which research papers they go over here: https://cmudeeprl.github.io/703website_f22/lectures/
I would start with Kaggle's RL course. It's a good intro and has links to David Silver's lecture series and Sutton and Barto's textbook. Both are excellent introductions to RL theory.
That kind of depends where you're starting from. What level are you at now?
I took courses about Machine Learning and Deep Learning at uni.
Sutton and Barto is obligatory if you want to learn RL, imo. Even as an experienced researcher, I read it every year or so. It's very approachable.
That looks really cool, thx
You should create a github repo first.
sonofmath t1_j7p0ml4 wrote
Not up to date, but a solid basis is Spinning Up