Submitted by AutoModerator t3_11pgj86 in MachineLearning
ilrazziatore t1_jch3vpu wrote
Reply to comment by LeN3rd in [D] Simple Questions Thread by AutoModerator
Uhm... the BNNs are built assuming a distribution both on the parameters (i.e. the values taken by the network weights) and on the data (the last layer has 2 outputs: the predicted mean and the predicted variance). Those 2 values are then used to build the loss function, which is the likelihood, a product of Gaussians. I think it captures both model and data uncertainty.
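A minimal sketch of that Gaussian likelihood loss (the function name and the choice of predicting a log-variance are my own assumptions, not the commenter's actual code; log-variance just keeps the variance positive without constraints):

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Mean negative log-likelihood of y under N(mu, exp(log_var)).

    mu and log_var are the two network outputs described above:
    a predicted mean and a predicted (log-)variance per sample.
    Minimizing this is equivalent to maximizing the product of
    per-sample Gaussian likelihoods. (Hypothetical helper.)
    """
    var = np.exp(log_var)
    return 0.5 * np.mean(np.log(2 * np.pi * var) + (y - mu) ** 2 / var)

# Example: a perfect mean prediction with unit predicted variance
y = np.array([0.0, 1.0, 2.0])
nll = gaussian_nll(y, mu=y, log_var=np.zeros(3))  # reduces to 0.5*log(2*pi)
```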
Let's say I compare the variances and the mean values predicted.
Do I have to set the same calibration and test datasets apart for both models, or can I use the entire dataset? The MCMC model can use the entire dataset without the risk of overfitting, but for the BNN it would be like cheating.
LeN3rd t1_jchht71 wrote
Then I would just use a completely separate test dataset. In a paper I would expect this as well.
ilrazziatore t1_jciyif4 wrote
Eh, data are scarce; I only have this dataset (it's composed of astrophysical measurements, and I cannot ask for more data to be produced).
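When the dataset is small and fixed, one common compromise (my suggestion, not something from the thread) is K-fold cross-validation: each model is fit K times and every point serves exactly once as held-out test data, so the comparison uses the whole dataset without any model seeing its own test points during fitting. A rough sketch, with the fold count and sizes purely illustrative:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold CV on n samples."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)          # shuffle once so folds are random
    folds = np.array_split(idx, k)    # k roughly equal-sized folds
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# Across the 5 folds, every sample appears exactly once as test data,
# so both models can be scored on the full (scarce) dataset fairly.
tests = [t for _, t in kfold_indices(20, 5)]
all_test = np.sort(np.concatenate(tests))
```

The same fold indices should be reused for both the MCMC model and the BNN so their predicted means and variances are compared on identical held-out points.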