colugo t1_j6ifbqy wrote
Reply to [D] DL university research PC suggestions? by seanrescs
Tim Dettmers has your answer.
colugo t1_j2xp63o wrote
Reply to [Discussion] If ML is based on data generated by humans, can it truly outperform humans? by groman434
Think of it more like this: the total intelligence/capability of all humans greatly exceeds that of any individual human. When we train AI models, we imbue them with capabilities derived from many humans. But once produced, they are easily copied, so we could quickly have a population of them, where each individual AI is equivalent to many humans (and thus more capable than any individual human) but perhaps less capable than all humans combined. Eventually, a large enough population of such AIs becomes more capable than the population of humans.
And then, if AIs can train new generations from prior AIs, this pattern could repeat and explode.
colugo t1_j1y5k71 wrote
Reply to [D] Is 16gb ram for macbook pro enough for ML?? by peno8
I kind of feel like the answer is: if you were doing the kind of work that needs more RAM, you'd know.
In deep learning in particular, RAM limits your maximum batch size, which can constrain how you train models. I'm not sure which hard limits you'd hit in other kinds of machine learning. More RAM is helpful, sure, but you can usually get by with less through more efficient code.
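To get a feel for the batch-size point, here's a back-of-the-envelope sketch. The function name, shapes, and dtype sizes are illustrative assumptions, and it only counts the input tensors themselves (activations, gradients, and optimizer state add considerably more on top):

```python
def batch_memory_bytes(batch_size, sample_shape, dtype_bytes=4):
    """Rough memory for one batch of input tensors.

    batch_size   -- number of samples per batch
    sample_shape -- shape of a single sample, e.g. (3, 224, 224) for an RGB image
    dtype_bytes  -- bytes per element (4 for float32, 2 for float16)
    """
    per_sample = dtype_bytes
    for dim in sample_shape:
        per_sample *= dim
    return batch_size * per_sample

# A batch of 64 RGB images at 224x224 in float32:
inputs_mb = batch_memory_bytes(64, (3, 224, 224)) / 2**20  # roughly 37 MB
```

Doubling the batch size doubles this number (and, roughly, the activation memory too), which is why a machine with less RAM forces smaller batches or tricks like gradient accumulation.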
colugo t1_iv5nzoq wrote
Whichever courses seem higher quality in your specific instance. Like is the professor well-regarded? Is the coursework rigorous and relevant? What do people say about the class?
If you've never done calculus, it's effectively a prerequisite for deeper probability/stats, so I'd lean slightly that way.
colugo t1_isr3udi wrote
Makes me think of prehistory and alien worlds!
colugo t1_j8be3q6 wrote
Reply to [D] Quality of posts in this sub going down by MurlocXYZ
It's ChatGPT writing about ChatGPT