Submitted by ratatouille_artist t3_y0qra7 in MachineLearning
Ulfgardleo t1_irv9w00 wrote
Reply to comment by gradientrun in [D] What are your thoughts about weak supervision? by ratatouille_artist
Quite easy to prove.
Take a multi-class classification problem. Now pick one class and assign it label 0, assign all the other classes the same coarse label 1, and try to find the maximum-margin classifier. This problem is equivalent to finding a convex polytope that separates class 0 from class 1 with maximum margin, which is NP-hard. Logistic regression is not much better, but that is more difficult to prove.
This is already NP-complete when the coarse label encompasses two classes: https://proceedings.neurips.cc/paper/2018/file/22b1f2e0983160db6f7bb9f62f4dbb39-Paper.pdf
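To make the construction concrete, here is a small illustrative sketch (not from the original comment) of the label-coarsening step in Python with scikit-learn and made-up blob data: one fine class keeps label 0, the remaining classes are merged into coarse label 1, and a single max-margin hyperplane is fit to the result. Exactly carving out class 0 would instead require an intersection of several half-spaces, i.e. the max-margin convex polytope whose computation is the NP-hard problem referenced above.

```python
# Illustrative sketch only: coarsening a multi-class problem into the
# "one class vs. everything else merged into one label" setup described above.
# The data and class layout are made up for the example.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

# Four "fine" classes; class 0 sits between the others, so no single
# hyperplane can separate it from the merged coarse class.
centers = [(0, 0), (4, 0), (-4, 0), (0, 4)]
X, y_fine = make_blobs(n_samples=400, centers=centers,
                       cluster_std=0.8, random_state=0)

# Coarsen the labels: class 0 keeps label 0, classes 1-3 become coarse label 1.
y_coarse = (y_fine != 0).astype(int)

# A single linear max-margin classifier cannot carve out class 0 here;
# that would need a convex polytope (intersection of half-spaces), and
# finding the max-margin polytope is the NP-hard problem cited above.
clf = LinearSVC(C=1.0).fit(X, y_coarse)
print("training accuracy with one hyperplane:", clf.score(X, y_coarse))
```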
ratatouille_artist OP t1_irvdebw wrote
Very interesting perspective on the difficulty of learning from weak labels. If I have time, it would be good to do a longer-form write-up on how effective skweak is for span extraction with its hidden Markov model aggregation approach.
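For context, a rough sketch of the skweak pipeline mentioned here, loosely following the examples in the skweak README: a couple of labelling functions produce noisy spans on spaCy docs, and an HMM aggregator merges them into a single annotation layer. The names used (heuristics.FunctionAnnotator, aggregation.HMM, fit_and_aggregate, utils.display_entities) are taken from the README and may differ between skweak versions, and it assumes the en_core_web_sm spaCy model is installed, so treat this as illustrative rather than exact.

```python
# Rough, version-dependent sketch of weak supervision for span extraction
# with skweak: noisy labelling functions + HMM aggregation.
import re
import spacy
from skweak import heuristics, aggregation, utils  # names per the skweak README; may vary by version

nlp = spacy.load("en_core_web_sm")
doc = nlp("Donald Trump paid $750 in federal income taxes in 2016.")

# Labelling function 1: crude MONEY detector (currency symbol followed by a number).
def money_detector(doc):
    for tok in doc[1:]:
        if tok.text[0].isdigit() and tok.nbor(-1).is_currency:
            yield tok.i - 1, tok.i + 1, "MONEY"

# Labelling function 2: four-digit years tagged as DATE.
def year_detector(doc):
    for tok in doc:
        if re.fullmatch(r"(19|20)\d{2}", tok.text):
            yield tok.i, tok.i + 1, "DATE"

lf1 = heuristics.FunctionAnnotator("money", money_detector)
lf2 = heuristics.FunctionAnnotator("years", year_detector)
doc = lf2(lf1(doc))

# Aggregate the (possibly conflicting) weak spans with a hidden Markov model.
hmm = aggregation.HMM("hmm", ["MONEY", "DATE"])
hmm.fit_and_aggregate([doc])

# Show the aggregated spans stored under the aggregator's name.
utils.display_entities(doc, "hmm")
```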