lmericle t1_jcyxiex wrote

Well, no, it isn't. You are looking for machine learning research. That list is only about LLMs, a very specific and over-hyped sub-sub-application of ML techniques.

If all you want is to attach yourself to the hype cycle, then that link still won't be enough, but at least it's a start.

0

lmericle t1_jcln487 wrote

You will find that in hype circles such as NLP there are a lot of thought-terminating clichés passed around by people who are not so deep in the weeds. Someone says something with confidence, another person doesn't know how to vet it and so just blindly passes it on, and all of a sudden a hack becomes a rumor becomes dogma. It seems to me that this is what's happened with context vs. memory.

Put another way: it's the kind of attitude that says "No, Mr. Ford, what we wanted was faster horses".

7

lmericle t1_jannla8 wrote

The trick with genetic algorithms is that you have to tune your approach very specifically to the kind of thing you're modelling. Different animals mate and evolve differently, to extend the analogy.

It's not enough to just do the textbook "1D chromosome" approach. You have to design your "chromosome", as well as your "crossover" and "mutation" operators, specifically for your problem (see the sketch below). In my experience, the crossover implementation is the most important one to focus on.
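For concreteness, here's a minimal sketch (my own toy example, not anything from the comment): a permutation-encoded GA, as you'd use for a route- or task-ordering problem. The chromosome, crossover, and mutation are all designed around the problem, because naive bitstring crossover would produce invalid permutations. The names `order_crossover` and `swap_mutation` are just illustrative.

```python
import random

def order_crossover(parent_a, parent_b):
    """Order crossover (OX): copy a slice from parent_a, then fill the
    remaining positions with parent_b's genes in their original order."""
    n = len(parent_a)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = parent_a[i:j]                      # keep a contiguous slice from parent_a
    fill = [g for g in parent_b if g not in child]  # remaining genes, in parent_b's order
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def swap_mutation(chromosome, rate=0.1):
    """Swap two positions with some probability -- keeps the permutation valid."""
    chromosome = list(chromosome)
    if random.random() < rate:
        i, j = random.sample(range(len(chromosome)), 2)
        chromosome[i], chromosome[j] = chromosome[j], chromosome[i]
    return chromosome

# Example: chromosomes are orderings of 8 cities/tasks/etc.
parent_a = list(range(8))
parent_b = random.sample(range(8), 8)
child = swap_mutation(order_crossover(parent_a, parent_b))
print(child)  # still a valid permutation of 0..7
```

The same idea applies to other encodings (trees, graphs, variable-length genomes): the operators have to respect whatever structure makes a chromosome valid, which is exactly where the textbook recipe stops helping.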

2