huehue12132 t1_j9e9xqf wrote

GANs can be useful as alternative/additional loss functions. See, e.g., the original pix2pix paper: https://arxiv.org/abs/1611.07004. Here, they have pairs (X, Y) available, so they could have just trained it as a regression task directly. However, they found better results using an L1 loss plus a GAN loss.
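
For concreteness, a minimal sketch of what that combined objective can look like in PyTorch. This is not the authors' code; `discriminator(x, y)` as a conditional discriminator and the other names are hypothetical, though the L1 weight of 100 matches what the paper reports:

```python
import torch
import torch.nn.functional as F

def generator_loss(discriminator, real_x, real_y, fake_y, lambda_l1=100.0):
    # Adversarial term: the generator wants the (conditional) discriminator
    # to label its output, given the input, as real.
    pred_fake = discriminator(real_x, fake_y)
    adv = F.binary_cross_entropy_with_logits(
        pred_fake, torch.ones_like(pred_fake))
    # Reconstruction term: plain L1 distance to the ground-truth target.
    l1 = F.l1_loss(fake_y, real_y)
    return adv + lambda_l1 * l1
```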

Keep in mind that using something like a squared-error loss carries a ton of underlying assumptions (if you interpret training as maximum likelihood estimation), such as the outputs being conditionally independent and following a Gaussian distribution. A GAN discriminator can represent a more complex, more appropriate loss function.
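
To make that connection concrete, here's a toy check (my own sketch, not from the comment): with the scale fixed at sigma = 1, the per-sample Gaussian negative log-likelihood is exactly half the squared error plus a constant, so minimizing one is minimizing the other:

```python
import math
import torch

y_pred = torch.randn(8)
y_true = torch.randn(8)

sq_err = (y_pred - y_true) ** 2
# -log N(y_true | mean=y_pred, sigma=1), computed per sample
nll = 0.5 * sq_err + 0.5 * math.log(2 * math.pi)

# The two objectives differ only by a constant offset and a scale factor.
offset = torch.full_like(nll, 0.5 * math.log(2 * math.pi))
print(torch.allclose(nll - 0.5 * sq_err, offset))  # True
```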

Note, many of these papers may well add nothing of value; my point is only that there are legitimate reasons to use GANs even if you have known input-output pairs.

15

[deleted] t1_j9f1cgt wrote

[removed]

2

notdelet t1_j9g627c wrote

> Assuming Gaussianity and then using maximum likelihood yields an L2 error minimization problem.

Incorrect; that's only true if you fix the scale parameter. I normally wouldn't nitpick like this, but your unnecessary use of bold made me.
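
To spell the nitpick out (my derivation, standard material, not from the deleted comment): with a free scale sigma, profiling it out of the Gaussian NLL via sigma_hat^2 = SSE/n leaves an objective that is the log of the squared error, so the minimizing mean is unchanged but the loss is no longer literally L2:

```latex
% Gaussian NLL with free scale, where SSE = \sum_i (y_i - \mu_i)^2:
\[
  \min_{\sigma}\left[\tfrac{n}{2}\log(2\pi\sigma^2)
    + \tfrac{\mathrm{SSE}}{2\sigma^2}\right]
  = \tfrac{n}{2}\log\!\left(\tfrac{2\pi e}{n}\,\mathrm{SSE}\right),
\]
% monotone in SSE, hence the same argmin over \mu, but a log-shaped loss.
```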

> (if you interpret training as maximum likelihood estimation)

> a squared loss does not "hide a Gaussian assumption".

It does... if you interpret training as (conditional) MLE. Give me a non-Gaussian distribution whose maximum-likelihood estimator yields the MSE loss. Also, residuals are explicitly not orthogonal projections whenever the variables are dependent.

0

notdelet t1_j9gija3 wrote

For future reference: blocking someone after replying to them prevents them from responding to your reply. To those reading along, this gives the false impression that I am choosing not to respond to you, when in fact I can't.

1