[D] ReLU + sigmoid output activation
Submitted by mrwafflezzz (t3_116nm8c) on February 19, 2023 at 8:51 PM in r/MachineLearning. 10 comments.
__lawless (t1_j97v07m) wrote on February 19, 2023 at 10:43 PM:
Easiest solution: no sigmoid, no ReLU in the last layer; just clamp the output between 0 and 1. Works surprisingly well.
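A minimal PyTorch sketch of what the comment suggests (the model and parameter names are hypothetical): the last layer stays plain linear, and the output is squeezed into [0, 1] with `torch.clamp` instead of a sigmoid.

```python
import torch
import torch.nn as nn

class ClampedRegressor(nn.Module):
    """Hypothetical model following the comment: no sigmoid/ReLU
    on the output, just a hard clamp into [0, 1]."""

    def __init__(self, in_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # raw linear output, no activation
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamp into [0, 1] in place of a sigmoid output activation.
        return torch.clamp(self.net(x), 0.0, 1.0)

# Usage sketch
model = ClampedRegressor(in_features=10)
y = model(torch.randn(4, 10))  # all values guaranteed in [0, 1]
```

One caveat worth keeping in mind: wherever the clamp is active, its gradient is zero, so samples whose raw output falls outside [0, 1] receive no gradient signal through the output, unlike with a sigmoid.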