
ortegaalfredo OP t1_jb8kdzj wrote

Here are the instructions; all you need is a Discord account. There are no limits on what you can ask it, and no rules. Please behave, as any spam will have to be removed:

https://twitter.com/ortegaalfredo/status/1632903130416308229

Code for the bot is here:

https://github.com/ortegaalfredo/celery-ai/blob/main/discord/bot.py
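To give an idea of the structure, here is a minimal sketch of how a bot like this can hand Discord messages to a locally hosted model. This is not the actual bot.py; `generate_reply` is a hypothetical stand-in for whatever inference call the real code makes.

```python
# Minimal sketch of a Discord bot that forwards messages to a locally
# hosted LLaMA instance. generate_reply() is a hypothetical placeholder
# for the actual inference call in bot.py.
import discord

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text in discord.py 2.x
client = discord.Client(intents=intents)

def generate_reply(prompt: str) -> str:
    # Placeholder: run the prompt through the local LLaMA model here.
    raise NotImplementedError

@client.event
async def on_message(message: discord.Message):
    if message.author == client.user:
        return  # ignore the bot's own messages
    if client.user in message.mentions:
        async with message.channel.typing():
            reply = generate_reply(message.content)
        await message.reply(reply[:2000])  # Discord caps messages at 2000 chars

client.run("YOUR_DISCORD_BOT_TOKEN")
```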

12

ReginaldIII t1_jb9goco wrote

Link to your code? It needs to be GPLv3 to be compliant with LLaMA's licensing.

How are you finding the quality of the output? I've had a little play around with the model but wasn't overly impressed. That said, a big parameter set like this makes a nice test bed for things like pruning methods.
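For context, here is a rough sketch of the kind of pruning experiment meant here, using PyTorch's built-in pruning utilities. The toy layer and the 30% sparsity level are placeholders, not anything LLaMA-specific.

```python
# Sketch of unstructured magnitude pruning with PyTorch's pruning utilities,
# the sort of experiment a large pretrained model is a convenient test bed for.
# The layer size and 30% sparsity are illustrative placeholders.
import torch
import torch.nn.utils.prune as prune

layer = torch.nn.Linear(4096, 4096)

# Zero out the 30% of weights with the smallest absolute value.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Make the pruning permanent (drops the mask/reparametrization).
prune.remove(layer, "weight")

sparsity = (layer.weight == 0).float().mean().item()
print(f"fraction of zeroed weights: {sparsity:.2f}")
```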

−4

abnormal_human t1_jb9kyzr wrote

Actually, it doesn't. GPLv3 only requires that if OP distributes a binary to someone, the source used to produce that binary is also made available. With server-side code, the binary isn't being distributed, so there's no obligation to distribute the source.

13

ReginaldIII t1_jb9xlil wrote

Fair enough, I didn't realize that hosting a publicly available service is not the same as distributing.

3

ortegaalfredo OP t1_jbi81mn wrote

I posted the GitHub repo in the original post. The output was bad because Meta's original generator is quite bad. I upgraded it today and it's much better now. Still not ChatGPT.
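As a rough illustration of the kind of generator tweak this refers to, here is a temperature plus top-p (nucleus) sampling step for picking the next token. The exact change made in the bot isn't stated, so the approach and the 0.8 / 0.95 values are assumptions, not the bot's actual settings.

```python
# Sketch of a temperature + top-p (nucleus) sampling step for next-token
# selection. The 0.8 / 0.95 values are assumed, not the bot's real settings.
import torch

def sample_next_token(logits: torch.Tensor,
                      temperature: float = 0.8,
                      top_p: float = 0.95) -> int:
    # Temperature-scaled probabilities over the vocabulary.
    probs = torch.softmax(logits / temperature, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, descending=True)
    cumulative = torch.cumsum(sorted_probs, dim=-1)
    # Keep the smallest set of tokens whose cumulative probability covers top_p.
    mask = cumulative - sorted_probs > top_p
    sorted_probs[mask] = 0.0
    sorted_probs /= sorted_probs.sum()
    choice = torch.multinomial(sorted_probs, num_samples=1)
    return sorted_idx[choice].item()
```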

1