Submitted by GraciousReformer t3_118pof6 in MachineLearning
GraciousReformer OP t1_j9j8bsl wrote
Reply to comment by VirtualHat in [D] "Deep learning is the only thing that currently works at scale" by GraciousReformer
Thank you. I understand the math. But I meant a real-world example where "the solution is not in the model class."
VirtualHat t1_j9j8uvr wrote
For example, in the IRIS dataset, the class label is not a linear combination of the inputs. Therefore, if your model class is all linear models, you won't find the optimal, or in this case even a good, solution.
If you extend the model class to include non-linear functions, then your hypothesis space at least contains a good solution, but finding it might be a bit more tricky.
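To make the model-class point concrete, here's a minimal sketch (scikit-learn assumed; the exact scores depend on the train/test split). The restricted class only contains linear decision boundaries, while the extended class also contains non-linear ones:

```python
# Minimal sketch: a linear model class vs. a non-linear one on Iris.
# Accuracies here are illustrative and vary with the split.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Linear model class: decision boundaries are hyperplanes in feature space.
linear = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Extended model class: the hypothesis space also contains non-linear boundaries.
nonlinear = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

print("linear    :", linear.score(X_te, y_te))
print("non-linear:", nonlinear.score(X_te, y_te))
```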
GraciousReformer OP t1_j9jgdmc wrote
But DL is not a linear model. So what, then, is the limit of DL?
terminal_object t1_j9jp51j wrote
You seem confused as to what you yourself are saying.
GraciousReformer OP t1_j9jppu7 wrote
"Artificial neural networks are often (demeneangly) called "glorified regressions". The main difference between ANNs and multiple / multivariate linear regression is of course, that the ANN models nonlinear relationships."
PHEEEEELLLLLEEEEP t1_j9k691x wrote
Regression doesn't just mean linear regression, if that's what you're confused about.
Acrobatic-Book t1_j9k94l4 wrote
The simplest example is the XOR problem (aka exclusive or). This is also why multilayer perceptrons, the basis of deep learning, were created in the first place: a linear model cannot solve it.
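A minimal sketch of that point (scikit-learn assumed; whether the MLP reaches 100% depends on the seed and hidden size): a linear classifier cannot separate XOR's four points, while a one-hidden-layer MLP can.

```python
# XOR: not linearly separable, so a linear classifier is stuck; a small MLP is not.
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])                        # XOR (exclusive or) labels

linear = Perceptron(max_iter=1000).fit(X, y)
mlp = MLPClassifier(hidden_layer_sizes=(4,), activation="tanh",
                    solver="lbfgs", max_iter=5000, random_state=0).fit(X, y)

print("linear:", linear.score(X, y))              # can never reach 1.0 on XOR
print("mlp   :", mlp.score(X, y))                 # typically 1.0
```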