Submitted by GraciousReformer t3_118pof6 in MachineLearning
yldedly t1_j9jorh1 wrote
Reply to comment by ewankenobi in [D] "Deep learning is the only thing that currently works at scale" by GraciousReformer
It's not from a paper, but I think it's pretty uncontroversial - though people like to forget about the "bounded interval" part, or at least what it implies about extrapolation.
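To be concrete, the statement I have in mind is the standard universal approximation theorem (my paraphrase, not a quote from any specific paper): for a suitable non-polynomial activation and enough hidden units,

$$\forall f \in C([a, b]),\ \forall \varepsilon > 0,\ \exists\, \hat{f} \text{ (a one-hidden-layer network) such that } \sup_{x \in [a, b]} |f(x) - \hat{f}(x)| < \varepsilon.$$

It says nothing about inputs outside $[a, b]$, which is exactly where extrapolation happens.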
[deleted] t1_j9jsgf6 wrote
What is "bounded interval" here?
yldedly t1_j9judc7 wrote
Any interval [a; b] where a and b are finite numbers. In practice, it means the approximation is only guaranteed to be good on the parts of the domain covered by training data; outside that range, i.e. when extrapolating, there are no guarantees at all. I have a concrete example in a blog post of mine: https://deoxyribose.github.io/No-Shortcuts-to-Knowledge/
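Here's a quick sketch of the effect (my own toy example in PyTorch, not the one from the post - the architecture and the sin(x) target are arbitrary choices): fit a small MLP on data drawn only from [-3, 3], then query it well outside that interval.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Training data only covers the bounded interval [-3, 3]
x_train = torch.linspace(-3, 3, 200).unsqueeze(1)
y_train = torch.sin(x_train)

# Small MLP; with enough hidden units it can fit sin(x) on [-3, 3] arbitrarily well
model = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for _ in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    opt.step()

with torch.no_grad():
    # Inside the interval: close to sin(x)
    x_in = torch.tensor([[1.5]])
    print("f(1.5)  ~", model(x_in).item(), " true:", torch.sin(x_in).item())
    # Outside the interval: no guarantee it keeps oscillating like sin(x)
    x_out = torch.tensor([[10.0]])
    print("f(10.0) ~", model(x_out).item(), " true:", torch.sin(x_out).item())
```

Inside [-3, 3] the fit is essentially perfect; at x = 10 the network typically flattens out or drifts off instead of continuing to oscillate, which is exactly the extrapolation behaviour the theorem doesn't speak to.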