Submitted by mrwafflezzz t3_116nm8c in MachineLearning
squidward2022 t1_j9au3bg wrote
Reply to comment by mrwafflezzz in [D] Relu + sigmoid output activation by mrwafflezzz
Yup! If you look at the graph of tanh, you'll see that relu(tanh(x)) squashes the left half of the graph to 0, since tanh is negative there. On the right half, over (0, infty), tanh ranges from 0 to 1, and it starts to saturate towards 1 around x ≈ 2-2.5. Since relu leaves these positive values unchanged, the output can get very close to 1 with reasonably small finite inputs.
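A minimal sketch of what that composition does numerically, assuming PyTorch (not something from the original thread): negative inputs get clamped to 0, and positive inputs approach 1 well before x reaches 3.

```python
import torch
import torch.nn.functional as F

# Sample inputs spanning both halves of the tanh graph
x = torch.tensor([-3.0, -1.0, 0.0, 0.5, 1.0, 2.0, 2.5, 3.0])

# relu(tanh(x)): zeros out the negative branch, keeps the (0, 1) branch as-is
y = F.relu(torch.tanh(x))
print(y)
# roughly: tensor([0.0000, 0.0000, 0.0000, 0.4621, 0.7616, 0.9640, 0.9866, 0.9951])
```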
mrwafflezzz OP t1_j9b414n wrote
Very interesting. Thanks!