Nameless1995 wrote:
There isn't an established standard AFAIK.
EDA is a simple baseline for augmentation: https://arxiv.org/abs/1901.11196
(See its citations on Google Scholar for more recent work.)
(Recent ones are playing around with counterfactual augmentation and such, but I'm not sure any standard, stable technique has emerged.)
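For reference, here's a quick sketch of two of EDA's four operations (random deletion and random swap) on whitespace-tokenized text; the full method also does synonym replacement and random insertion via WordNet, and ties the augmentation strength to sentence length, so treat the constants below as placeholder hyperparameters:

```python
# Minimal EDA-style sketch: random deletion and random swap only.
import random

def random_deletion(words, p=0.1):
    """Drop each word with probability p, keeping at least one word."""
    if len(words) <= 1:
        return words
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]

def random_swap(words, n_swaps=1):
    """Swap two randomly chosen word positions, n_swaps times."""
    words = words[:]
    for _ in range(n_swaps):
        if len(words) < 2:
            break
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

tokens = "data augmentation helps in low resource settings".split()
print(" ".join(random_deletion(tokens)))
print(" ".join(random_swap(tokens, n_swaps=2)))
```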
This one had nice low-resource performance: https://arxiv.org/pdf/2106.05469.pdf
Also this: https://aclanthology.org/2021.emnlp-main.749.pdf (you can find newer work through its citations on Google Scholar/Semantic Scholar).
I think prompt tuning and contrastive learning (https://openreview.net/pdf?id=cu7IUiOhujH) also showed better performance in very low-resource settings, but the benefit tapers off as you add more data.
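I haven't re-checked the exact objective that paper uses, but as a generic sketch, contrastive losses in this space are usually InfoNCE-style over two views of the same sentence (e.g. two dropout passes, SimCSE-style); the encoder is assumed here, only the loss is shown:

```python
# Generic InfoNCE-style contrastive loss over paired sentence embeddings.
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.05):
    """z1, z2: [batch, dim] embeddings of the same sentences under two views."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                    # [batch, batch] similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)                # diagonal pairs are positives

# Usage with random stand-in embeddings:
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = info_nce_loss(z1, z2)
```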
If you are after adversarial robustness, there are other techniques for that; I think FreeLB was popular a while ago. There's also SAM for finding flatter minima.
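FreeLB adds adversarial perturbations in the embedding space during fine-tuning; SAM instead perturbs the weights to seek flat minima. A minimal single-step SAM sketch (simplified, not the reference implementation), assuming a generic PyTorch `model`, `optimizer`, and a `loss_fn(model, batch)` you supply:

```python
# One simplified SAM update: perturb weights along the gradient, recompute the
# gradient at the perturbed point, then step from the original weights.
import torch

def sam_step(model, loss_fn, batch, optimizer, rho=0.05):
    # 1) gradient at the current weights
    loss_fn(model, batch).backward()
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(2) for g in grads]), 2) + 1e-12
    # 2) perturb weights in the normalized gradient direction
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                eps.append(None)
                continue
            e = rho * p.grad / grad_norm
            p.add_(e)
            eps.append(e)
    optimizer.zero_grad()
    # 3) gradient at the perturbed point ("sharpness-aware" gradient)
    loss_fn(model, batch).backward()
    # 4) undo the perturbation, then step using that gradient
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            if e is not None:
                p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
```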