Submitted by seraphaplaca2 t3_122fj05 in MachineLearning
TeH_Venom t1_je21d7u wrote
Not quite cross-architecture, but it's not impossible to merge different fine-tunes of the same model into one.
I personally have scripts for a few strategies, such as:
- Average merge;
- Diff merge;
- Block merging. (link)
I haven't tested diff merging or block merging much (a friend and I finished adapting SD's block merge to LMs last week), but weighted average merges are a pretty safe way of mixing models.
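For reference, a weighted average merge of two fine-tunes boils down to blending each parameter tensor. This is a minimal sketch, not the commenter's actual script; the function name and the toy state dicts are made up for illustration, and it assumes both checkpoints are fine-tunes of the same base model so their keys match:

```python
import torch

def average_merge(state_a, state_b, alpha=0.5):
    """Blend two PyTorch state dicts: alpha * A + (1 - alpha) * B.

    Only valid for fine-tunes of the same base architecture,
    so every parameter name and shape lines up.
    """
    assert state_a.keys() == state_b.keys(), "models must share an architecture"
    merged = {}
    for name, tensor_a in state_a.items():
        tensor_b = state_b[name]
        if torch.is_floating_point(tensor_a):
            merged[name] = alpha * tensor_a + (1.0 - alpha) * tensor_b
        else:
            # Non-float entries (e.g. integer buffers) are copied, not averaged.
            merged[name] = tensor_a.clone()
    return merged

# Toy demonstration with two tiny fake "models"
a = {"w": torch.ones(2, 2)}
b = {"w": torch.zeros(2, 2)}
m = average_merge(a, b, alpha=0.5)  # every element of m["w"] is 0.5
```

A diff merge follows the same pattern, except you add `alpha * (fine_tune - base)` on top of a base checkpoint instead of averaging the weights directly.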