Submitted by thomasahle t3_118gie9 in MachineLearning
thomasahle OP t1_j9kapw7 wrote
Reply to comment by ChuckSeven in Unit Normalization instead of Cross-Entropy Loss [Discussion] by thomasahle
Even with angles, you can still have exponentially many vectors that are nearly orthogonal to each other, if that's what you mean.
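A quick numerical sketch of that claim (my own illustration, not from the thread): random unit vectors in high dimension concentrate near orthogonality, so many of them can coexist at small angular deviations from 90 degrees.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 1000, 200  # dimension and number of vectors (arbitrary choices)

# Sample random directions and unit-normalize them
vecs = rng.standard_normal((n, d))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)

# Cosine similarity of every distinct pair; for random unit vectors
# these are approximately N(0, 1/d), i.e. tightly concentrated near 0
cos = vecs @ vecs.T
off_diag = cos[~np.eye(n, dtype=bool)]
print(f"max |cos| over all pairs: {np.abs(off_diag).max():.3f}")
```

Even with 200 vectors in 1000 dimensions, the largest pairwise cosine similarity stays small, which is the "exponentially many nearly orthogonal vectors" phenomenon in miniature.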
I agree the representations will be different. Indeed, one issue may be that large negative entries are penalized as much as large positive ones, which is not the case for logsumexp.
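To make the asymmetry concrete (my own sketch, not from the thread): logsumexp barely moves when one entry becomes very negative, but responds strongly when one entry becomes very positive, whereas any norm-based penalty on the entries is symmetric in sign.

```python
import numpy as np

def logsumexp(z):
    # Numerically stable log(sum(exp(z)))
    m = z.max()
    return m + np.log(np.exp(z - m).sum())

z = np.zeros(5)
base = logsumexp(z)  # log(5)

z_pos, z_neg = z.copy(), z.copy()
z_pos[0] += 5.0  # one large positive entry: dominates the sum
z_neg[0] -= 5.0  # one large negative entry: nearly ignored

print(f"shift from +5 entry: {logsumexp(z_pos) - base:+.3f}")  # large
print(f"shift from -5 entry: {logsumexp(z_neg) - base:+.3f}")  # tiny
```

Under a squared (or angular) distance to a target, the +5 and -5 perturbations would cost exactly the same, which is the difference being pointed out.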
But on the other hand, more "geometric" representations like this, based on angles, may make the vectors more suitable for techniques like LSH.
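For intuition on why angles suit LSH (my own sketch, using the standard random-hyperplane / SimHash scheme, not something from the thread): two vectors agree on a random sign bit with probability 1 - theta/pi, so Hamming distance between hash codes directly estimates the angle.

```python
import numpy as np

rng = np.random.default_rng(1)
d, bits = 64, 10_000  # many bits only to estimate the probability well

planes = rng.standard_normal((bits, d))  # random hyperplane normals

def simhash(v):
    # One sign bit per random hyperplane
    return planes @ v > 0

u = rng.standard_normal(d)
v = u + 0.5 * rng.standard_normal(d)  # a vector at a moderate angle to u

agree = np.mean(simhash(u) == simhash(v))
angle = np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
print(f"empirical bit agreement {agree:.3f} "
      f"vs predicted 1 - theta/pi = {1 - angle/np.pi:.3f}")
```

Because collision probability depends only on the angle, representations trained to encode similarity in angles plug directly into this kind of hashing.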