MysteryInc152 OP t1_jcpzgd4 wrote
Reply to comment by Temporary-Warning-34 in [R] ChatGLM-6B - an open source 6.2 billion parameter Eng/Chinese bilingual LLM trained on 1T tokens, supplemented by supervised fine-tuning, feedback bootstrap, and RLHF. Runs on consumer grade GPUs by MysteryInc152
Bootstrapping is basically taking a model's best/better outputs on a certain task and finetuning on that.
EDIT: Seems I'm wrong on that
MisterManuscript t1_jcry6cj wrote
That's not what bootstrapping is. Bootstrapping is a resampling technique that creates multiple datasets of the same size as the original by random sampling with replacement. It's done to estimate the standard error of a statistic.
Here's the link to the ISLR textbook. The bootstrap chapter will verify what it is.
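The resampling described above can be sketched with just the standard library — the function name, toy data, and resample count here are illustrative, not from the thread:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_resamples=1000, seed=0):
    """Estimate the standard error of `stat` by resampling with replacement.

    Each bootstrap sample is the same size as `data` and is drawn with
    replacement, so some points repeat and others are left out.
    """
    rng = random.Random(seed)
    estimates = [
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_resamples)
    ]
    # The spread of the resampled statistics approximates the standard
    # error of the statistic on the original sample.
    return statistics.stdev(estimates)

data = [2.1, 3.4, 1.9, 4.2, 3.3, 2.8, 3.9, 2.5]
print(bootstrap_se(data))
```

For the mean, the result should land near the analytic standard error, sample standard deviation divided by the square root of n.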
MysteryInc152 OP t1_jcrz16i wrote
Yeah, it seems I'm wrong. I'd read a few articles using "bootstrapping" in the sense I described, so I assumed that was the general meaning.
relevantmeemayhere t1_jcrotun wrote
Mm, not really.
Bootstrapping is used to estimate the standard error of estimates via resampling. From there we can derive tools like confidence intervals and other interval estimates.

Generally speaking, you don't use the bootstrap to tune your model's parameters. You use cross-validation for that.
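The contrast with cross-validation can be made concrete: instead of resampling with replacement, k-fold cross-validation partitions the data into disjoint folds and holds each one out in turn for evaluation. A minimal index-splitting sketch (function name and fold count are mine, not from the thread):

```python
import random

def kfold_splits(n, k=5, seed=0):
    """Yield (train_indices, val_indices) pairs for k-fold cross-validation.

    Unlike bootstrap resampling, every index appears in exactly one
    validation fold, and train/validation sets never overlap.
    """
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        val = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, val

for train, val in kfold_splits(10, k=5):
    print(len(train), len(val))
```

Model parameters (or hyperparameters) are then chosen by the average score across the k held-out folds.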