
ChronoPsyche t1_itflq78 wrote

Oh there are certainly workarounds! I agree 100%. These workarounds are just that though, workarounds. We won't be able to leverage the full power of long-form content generation until we solve the memory issues.

Which is fine. There are still so many more advances that can be made within the space of our current limitations.
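(The comment doesn't name specific workarounds, but a common one is to keep recent context verbatim and compress older context into a running summary so the prompt stays inside the model's fixed window. A minimal sketch, assuming a hypothetical `generate()` stand-in for whatever model API you use and an arbitrary token budget:)

```python
# Sketch of one common memory workaround: keep recent turns verbatim and
# fold older turns into a running summary so the prompt fits the context
# window. `generate()` is a hypothetical placeholder, not a real API.

def generate(prompt: str) -> str:
    """Placeholder for an actual model call (e.g. an API request)."""
    raise NotImplementedError

def rough_token_count(text: str) -> int:
    # Crude approximation: roughly 4 characters per token.
    return len(text) // 4

class RollingMemory:
    def __init__(self, budget_tokens: int = 3000):
        self.budget = budget_tokens
        self.summary = ""   # compressed view of older turns
        self.recent = []    # recent turns kept verbatim

    def add_turn(self, turn: str) -> None:
        self.recent.append(turn)
        # When the verbatim history outgrows the budget, summarize the
        # oldest turns instead of dropping them entirely.
        while rough_token_count("\n".join(self.recent)) > self.budget:
            oldest = self.recent.pop(0)
            self.summary = generate(
                "Briefly summarize the conversation so far:\n"
                f"{self.summary}\n{oldest}"
            )

    def build_prompt(self, new_input: str) -> str:
        return (
            f"Summary of earlier conversation: {self.summary}\n"
            + "\n".join(self.recent)
            + f"\n{new_input}"
        )
```

(The tradeoff is exactly the point being made: the summary is lossy, so long-range details can still vanish. It's a workaround, not a solution to the underlying memory limit.)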


visarga t1_itgqug0 wrote

There is also exponentially less long-form content than short-form. The longer it gets, the fewer samples we have to train on.
