Submitted by AlmightySnoo t3_117iqtp in MachineLearning
Optimal-Asshole t1_j9c20cy wrote
I think these workshops accept every submission that is not incoherent or desk rejected.
From my quick glance, it doesn't seem like plagiarism, since they cite their sources amply. As far as the justification goes, there are some generative-model-based approaches for solving parametric PDEs even now. It doesn't seem like the best paper ever, but I don't think it's that bad.
AlmightySnoo OP t1_j9c2trd wrote
>It doesn’t seem like plagiarism, since they do ample citation.
It is when you are pretending to do things differently while in practice you do the exact same thing and add a useless layer (the GAN) to give the false impression of novelty. Merely citing sources in such cases doesn't shield you from being accused of plagiarism.
>As far as the justification goes, there are some generative based approaches for solving parametric PDEs even now.
Not disputing that there might be papers out there where the use is justified, of course there are skilled researchers with academic integrity. But again, in this paper, and the ones I'm talking about in general, the setting is exactly as in my 2nd paragraph, where the use of GANs is clearly not justified at all.
>but I don’t think it’s that bad
Again, in the context of my second paragraph (because that's literally what they're doing), it is bad.
Optimal-Asshole t1_j9c4h8d wrote
Okay lol, so I'm actually researching kinda similar things, and I assumed this paper was related because it uses similar tools, but upon a closer look, nope, nvm. It's not even using the generative model for anything useful.
So their paper just shows that the basic idea of least-squares PDE solving can be used for generative models. Okay, now it's average class-project tier. I guess this demonstrates that, yes, these workshops accept literally anything.
Edit: it's still not plagiarism, just not very novel. Plagiarism is stealing ideas without credit; what they did was discuss an existing idea and extend it experimentally in a very small way. Not plagiarism.
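For anyone unfamiliar, "least-squares PDE solving" just means minimizing the squared PDE residual over some parameterized ansatz. Here's a minimal sketch of that idea using a sine basis in place of a neural network (so the least-squares problem is linear and has a closed-form solution); this is an illustration of the general technique, not the paper's actual method:

```python
import numpy as np

# Collocation grid (interior points; the sine basis already satisfies u(0)=u(1)=0)
x = np.linspace(0, 1, 101)[1:-1]
K = 5
ks = np.arange(1, K + 1)

# Basis phi_k(x) = sin(k*pi*x); its second derivative is -(k*pi)^2 * phi_k(x)
Phi = np.sin(np.outer(x, ks * np.pi))          # (n, K)
Phi_xx = -(ks * np.pi) ** 2 * Phi              # (n, K)

# Target PDE: u'' = f, with f chosen so the exact solution is sin(pi*x)
f = -np.pi ** 2 * np.sin(np.pi * x)

# Least-squares fit of the PDE residual ||Phi_xx @ c - f||^2
c, *_ = np.linalg.lstsq(Phi_xx, f, rcond=None)

u_hat = Phi @ c
u_exact = np.sin(np.pi * x)
print(np.max(np.abs(u_hat - u_exact)))         # near machine precision
```

With a neural-network ansatz you minimize the same squared residual by gradient descent instead of `np.linalg.lstsq`, but the objective is the same squared loss.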
vikumwijekoon97 t1_j9fho30 wrote
I was looking into similar things in my undergrad thesis. My math wasn't great so I couldn't comprehend much. Are there actual NN methods that can solve PDEs without depending on the initial conditions? I was looking into soft-body physics simulation using GPUs.
Optimal-Asshole t1_j9fktzg wrote
> Are there actual NN methods that can solve PDEs without depending on the initial conditions?
The initial condition needs to be known (though it can be noisy, e.g. measurements corrupted by noise [1]), but NN-based models can solve some parametric PDEs faster than traditional solvers. [2]
There is also a lot of work on training NNs on data generated by traditional methods, and this can be combined with the above approach to solve a whole class of problems at once. [3]
Solving a whole parametric family of PDEs (i.e. a parameterized family of initial conditions) and handling complicated geometries will be the next avenue for this specific field IMO; in fact, it's already being actively worked on.
[1] https://arxiv.org/abs/2205.07331
vikumwijekoon97 t1_j9fma3t wrote
That's pretty awesome
AlmightySnoo OP t1_j9c56bx wrote
>It’s not even using the generative model for anything useful.
Thank you, that's literally what I meant in my second paragraph. They're literally training the GAN to learn Dirac distributions. The noise has no use, and the discriminator eventually ends up learning to do roughly the job of a simple squared loss.