Submitted by eternalmathstudent t3_z7z9gg in deeplearning
carbocation t1_iyb3f36 wrote
While convolution is a bit funky with tabular data (what locality are you exploiting?), I think that attention is a mechanism that might make sense in the deep learning context for tabular data. For example, take a look at recent work such as https://openreview.net/forum?id=i_Q1yrOegLY (code and PDF linked from there).
eternalmathstudent OP t1_iyb3ny8 wrote
I don't want to use ResNet as-is; I'm not after the convolutional layers themselves. I'm looking for general-purpose residual blocks with skip connections.
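(A convolution-free residual block of the kind described here is straightforward to write: two fully connected layers with a skip connection around them. Below is a minimal PyTorch sketch; the layer widths, normalization, and dropout rate are illustrative assumptions, not a prescribed design.)

```python
import torch
import torch.nn as nn

class ResidualMLPBlock(nn.Module):
    """Residual block for tabular features: two linear layers
    plus a skip connection, no convolutions."""
    def __init__(self, dim: int, hidden: int, dropout: float = 0.1):
        super().__init__()
        self.norm = nn.BatchNorm1d(dim)
        self.fc1 = nn.Linear(dim, hidden)
        self.fc2 = nn.Linear(hidden, dim)
        self.act = nn.ReLU()
        self.drop = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pre-norm residual: output = x + F(norm(x))
        h = self.act(self.fc1(self.norm(x)))
        h = self.drop(self.fc2(h))
        return x + h

# Example: stack a few blocks on 32 numeric input features
model = nn.Sequential(
    nn.Linear(32, 64),
    ResidualMLPBlock(64, 128),
    ResidualMLPBlock(64, 128),
    nn.Linear(64, 1),
)
```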
carbocation t1_iybb5a8 wrote
Yes, which is why I think you’ll find that link of particular interest since they comment on it (and attention).
rjog74 t1_iybbf02 wrote
Any particular reason for ResNet specifically, if what you're looking for is general-purpose residual blocks?