[D] ReLU + sigmoid output activation Submitted by mrwafflezzz t3_116nm8c on February 19, 2023 at 8:51 PM in MachineLearning 10 comments 2
bremen79 t1_j97sb9r wrote on February 19, 2023 at 10:24 PM The sigmoid will effectively make it very hard for the network to produce values close to 1, because that would require a pre-activation value close to infinity. Would this be good behavior in your application? 4
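To illustrate the point, here is a minimal Python sketch (assuming the standard logistic sigmoid) showing the pre-activation values needed to reach outputs near 1:

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid: maps a pre-activation x to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def logit(p):
    # Inverse sigmoid: the pre-activation required to produce output p
    return math.log(p / (1.0 - p))

# Pre-activations required for outputs close to 1
for target in (0.9, 0.99, 0.999, 0.9999):
    print(f"output {target} needs pre-activation ~ {logit(target):.2f}")

# An output of exactly 1.0 would need an infinite pre-activation,
# so the network can only approach 1 asymptotically.
```

Running this prints roughly 2.20, 4.60, 6.91, and 9.21, so each extra "9" of output precision demands a pre-activation larger by about 2.3.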