Submitted by mrwafflezzz t3_116nm8c in MachineLearning
mrwafflezzz OP t1_j99f9kl wrote
Reply to comment by squidward2022 in [D] Relu + sigmoid output activation by mrwafflezzz
Will it be able to approach 1 somewhat effectively as well?
squidward2022 t1_j9au3bg wrote
Yup! If you look at the graph of tanh you'll see that relu(tanh(x)) squashes the left half of the graph to 0. On the right half, (0, infty), tanh ranges between 0 and 1, and it starts to saturate towards 1 around x ≈ 2-2.5. Since relu leaves this half unchanged, you'll be able to approach 1 very effectively with reasonable finite values.
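A quick numerical sketch of this, assuming PyTorch (the sample points and the relu(tanh(x)) composition are just for illustration):

```python
import torch

x = torch.linspace(-4.0, 4.0, steps=9)   # inputs on both sides of 0
y = torch.relu(torch.tanh(x))             # relu(tanh(x)) as discussed above

for xi, yi in zip(x.tolist(), y.tolist()):
    print(f"x = {xi:5.1f} -> relu(tanh(x)) = {yi:.4f}")
# negative inputs map to exactly 0; positive inputs climb towards 1,
# nearly saturating by x ≈ 2-2.5
```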
mrwafflezzz OP t1_j9b414n wrote
Very interesting. Thanks!