Submitted by MichelMED10 t3_y9yuza in MachineLearning
Hello,
While doing some tests, I saw that XGBoost is way better than a multilayer NN classifier for classification.
So I thought that first training a CNN/Transformer backbone with a "normal" classifier as a head for any classification/regression task, then freezing the backbone and training an XGBoost classifier on top, would be a good idea.
But none of the recent papers do that; they all tend to use a linear/multilayer NN classifier head.
Anyone know why?
Thanks!
patrickSwayzeNU t1_it8irde wrote
This post will likely get deleted. You should post in r/learnmachinelearning
Xgboost tends to be best on tabular data - you didn’t mention what domain you’re working on.
Creating entity embeddings from NNs and passing them to other downstream classifiers definitely is a thing.
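[Editor's note: a minimal sketch of the pattern discussed above — extract features with a frozen backbone, then fit gradient-boosted trees on the embeddings. To keep it self-contained and runnable, the "backbone" here is a fixed random projection standing in for a trained CNN/Transformer, and sklearn's `GradientBoostingClassifier` stands in for XGBoost; with a real pipeline you would swap in your trained model and `xgboost.XGBClassifier`.]

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy data: 200 samples, 32 raw features, binary labels.
X_raw = rng.normal(size=(200, 32))
y = (X_raw[:, :4].sum(axis=1) > 0).astype(int)

# "Frozen backbone": a fixed random projection + ReLU standing in for a
# trained CNN/Transformer feature extractor (illustrative assumption).
W = rng.normal(size=(32, 16))

def backbone(x):
    # Weights W are frozen -- no further training happens here.
    return np.maximum(x @ W, 0.0)

X_emb = backbone(X_raw)  # embeddings become the tabular input

X_tr, X_te, y_tr, y_te = train_test_split(X_emb, y, random_state=0)

# Boosted trees as the classification head on the frozen embeddings
# (sklearn stand-in for XGBoost, same fit/predict interface).
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

The key design point is that the tree model never backpropagates into the backbone, so the embeddings must already be discriminative before the head is trained.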