ab3rratic t1_jegoldu wrote
Reply to Should I continue with this? by Eric-Cardozo
If you are still learning, by all means continue, but know that your library likely won't be competitive with what's already out there.
ab3rratic t1_jcorj10 wrote
Reply to [Discussion] Future of ML after chatGPT. by [deleted]
There is life outside of NLP and CV.
ab3rratic t1_jakzbv6 wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
GAs are poorly suited to expensive-to-evaluate objective functions, since they burn through many fitness evaluations per generation. And expensive objectives (think training a model for every candidate) have become rather relevant lately.
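To make the cost concrete, here is a minimal GA sketch in plain Python (all names and parameters are mine, not from any particular library). The point is the evaluation count: each generation spends a full population's worth of calls to the objective, which is exactly what you cannot afford when one call means a training run or a simulation.

```python
import random

def genetic_minimize(f, dim, pop_size=50, generations=100,
                     mutation_rate=0.1, mutation_scale=0.5):
    # Random initial population of real-valued vectors.
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=f)            # pop_size evaluations of f here
        parents = scored[: pop_size // 2]      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, dim) if dim > 1 else 0
            child = a[:cut] + b[cut:]          # one-point crossover
            child = [g + random.gauss(0, mutation_scale)
                     if random.random() < mutation_rate else g
                     for g in child]           # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=f)

sphere = lambda x: sum(g * g for g in x)
print(genetic_minimize(sphere, dim=3))
# ~50 * 100 = 5,000 evaluations of f: trivial for a toy objective,
# prohibitive if each evaluation takes minutes.
```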
ab3rratic t1_izh1s2j wrote
Reply to comment by IdeaEnough443 in [D] What is the recommended approach to training NN on big data set? by IdeaEnough443
See "deep learning".
ab3rratic t1_izgrfxy wrote
Reply to comment by IdeaEnough443 in [D] What is the recommended approach to training NN on big data set? by IdeaEnough443
Mini-batch gradient descent (the usual training method) does not require the entire dataset to fit into memory -- only one batch at a time, as it were.
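As a sketch of what that looks like in practice, assuming PyTorch (the file name, shapes, and the LazyArrayDataset helper are hypothetical): a Dataset can read individual samples lazily from disk, so only the current mini-batch ever materializes in RAM.

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class LazyArrayDataset(Dataset):
    def __init__(self, path, n_samples, n_features):
        # memmap maps the file into virtual memory; rows load on access,
        # so the full dataset never has to fit in RAM.
        self.x = np.memmap(path, dtype=np.float32, mode="r",
                           shape=(n_samples, n_features + 1))

    def __len__(self):
        return self.x.shape[0]

    def __getitem__(self, i):
        row = np.array(self.x[i])              # copies just this one row
        return torch.from_numpy(row[:-1]), torch.tensor(row[-1])

ds = LazyArrayDataset("train.bin", n_samples=10_000_000, n_features=128)
loader = DataLoader(ds, batch_size=256, shuffle=True, num_workers=4)

model = torch.nn.Linear(128, 1)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
for xb, yb in loader:                          # one 256-row batch in memory
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(xb).squeeze(-1), yb)
    loss.backward()
    opt.step()
```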
ab3rratic t1_jegpq3d wrote
Reply to comment by Ok_Development1633 in Should I continue with this? by Eric-Cardozo
It might be. But then tensorflow/pytorch are out there too, are well documented, and have CPU-only modes.
I say this as someone who has coded a number of numerical optimization algorithms (BFGS, Nelder-Mead, etc.) from scratch just to understand them, knowing all along that I wasn't competing with real OR libraries. On a few occasions my own implementations came in handy for proof-of-concept work, when pulling in third-party libraries would have been a hassle.
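For the curious, a from-scratch Nelder-Mead in that spirit fits in a few dozen lines. This is a simplified textbook variant (standard reflection/expansion/contraction/shrink steps with the usual coefficients, with the contraction logic condensed to the inside case), meant for understanding rather than for competing with scipy.optimize:

```python
import numpy as np

def nelder_mead(f, x0, step=0.1, max_iter=500, tol=1e-8):
    # Initial simplex: x0 plus one point perturbed along each axis.
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        p = simplex[0].copy()
        p[i] += step
        simplex.append(p)
    fvals = [f(p) for p in simplex]

    for _ in range(max_iter):
        order = np.argsort(fvals)              # best vertex first
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break

        centroid = np.mean(simplex[:-1], axis=0)   # centroid of all but worst

        xr = centroid + (centroid - simplex[-1])   # reflection
        fr = f(xr)
        if fvals[0] <= fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
            continue

        if fr < fvals[0]:                          # expansion
            xe = centroid + 2.0 * (centroid - simplex[-1])
            fe = f(xe)
            if fe < fr:
                simplex[-1], fvals[-1] = xe, fe
            else:
                simplex[-1], fvals[-1] = xr, fr
            continue

        xc = centroid + 0.5 * (simplex[-1] - centroid)   # contraction
        fc = f(xc)
        if fc < fvals[-1]:
            simplex[-1], fvals[-1] = xc, fc
            continue

        # Shrink the whole simplex toward the best vertex.
        simplex = [simplex[0]] + [simplex[0] + 0.5 * (p - simplex[0])
                                  for p in simplex[1:]]
        fvals = [fvals[0]] + [f(p) for p in simplex[1:]]

    best = int(np.argmin(fvals))
    return simplex[best], fvals[best]

# Example: minimize the Rosenbrock function.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x, fx = nelder_mead(rosen, [-1.2, 1.0], max_iter=2000)
print(x, fx)
```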