asarig_ OP t1_j73g1ne wrote

Reply to comment by gdpoc in [R] Graph Mixer Networks by asarig_

Thanks for your interest. If you open an issue on GitHub about this, I will keep it in mind as a reminder, and I can share pre-trained weights at the appropriate time.

asarig_ OP t1_j71wbqs wrote

Reply to comment by SatoshiNotMe in [R] Graph Mixer Networks by asarig_

Of course. The MLP-Mixer is a recent approach, first developed for image classification; it was introduced independently by Google and Oxford researchers in May 2021.

The MLP-Mixer, also known simply as "Mixer", is a type of image architecture that doesn't incorporate convolutions or self-attention. Instead, it relies solely on the use of multi-layer perceptrons (MLPs) that are repeatedly applied either to different spatial locations or feature channels.
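To make the "MLPs applied across spatial locations or feature channels" idea concrete, here is a minimal NumPy sketch of one Mixer layer. It is an illustration, not the paper's implementation: LayerNorm is omitted, ReLU stands in for GELU, and all weight names are invented for the example.

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    # Simple two-layer MLP; ReLU here for brevity (the Mixer paper uses GELU).
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

def mixer_layer(x, params):
    # x has shape (tokens, channels).
    # Token mixing: the same MLP is applied across the token (spatial) axis,
    # so we transpose, mix, and transpose back. Residual connections as in the paper.
    y = x + mlp(x.T, *params["token"]).T
    # Channel mixing: the same MLP is applied independently at each token.
    return y + mlp(y, *params["channel"])

rng = np.random.default_rng(0)
tokens, channels, hidden = 16, 32, 64
params = {
    "token": (0.02 * rng.normal(size=(tokens, hidden)), np.zeros(hidden),
              0.02 * rng.normal(size=(hidden, tokens)), np.zeros(tokens)),
    "channel": (0.02 * rng.normal(size=(channels, hidden)), np.zeros(hidden),
                0.02 * rng.normal(size=(hidden, channels)), np.zeros(channels)),
}
x = rng.normal(size=(tokens, channels))
out = mixer_layer(x, params)
print(out.shape)  # (16, 32)
```

Note that the token-mixing weights have a fixed size tied to the number of tokens, which is why the standard Mixer assumes a fixed input resolution.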

In this work, instead of the Transformers that are normally applied on graphs, I tried to use Mixers as a new kernel method on graphs. The aim is to find out how they perform with linear complexity, avoiding the O(n²) complexity of Transformers.
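The complexity claim can be illustrated with rough matrix-multiply FLOP counts for one layer (my own back-of-the-envelope sketch, not figures from the paper): self-attention scores and value aggregation cost about 2·n²·d, while token mixing with a hidden width h costs about 2·n·h·d, which is linear in the number of tokens/nodes n.

```python
# n = number of tokens/nodes, d = feature channels, h = mixing-MLP hidden width.
# These are rough estimates counting only matrix-multiply FLOPs.

def attention_flops(n, d):
    # QK^T score matrix: n*n*d, weighted sum over values: n*n*d -> quadratic in n.
    return 2 * n * n * d

def token_mixing_flops(n, d, h):
    # Per channel: n -> h and h -> n projections, done for all d channels -> linear in n.
    return 2 * n * h * d

for n in (100, 1000, 10000):
    print(n, attention_flops(n, d=64), token_mixing_flops(n, d=64, h=256))
```

Scaling n by 10 multiplies the attention cost by 100 but the token-mixing cost only by 10.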
