
Gab1024 t1_je1njhv wrote

I like it, but it would have been better if it were powered by GPT-4 rather than GPT-3.5.

32

maskedpaki t1_je23g0c wrote

We'll get a GPT-4 Turbo sometime this year, I think, and 3.5 will be dead.

22

ArthurParkerhouse t1_je2y9t1 wrote

Unless the API cost for 3.5 drops even lower than it currently is.

3

maskedpaki t1_je2yji8 wrote

I dunno about that. I think at some point quality starts to matter more than cost.

I mean, you can run GPT-2 for free on your own GPU, but no one cares to (see the sketch below).

8
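As the comment above notes, GPT-2 is small enough to run locally at no API cost. A minimal sketch, assuming the Hugging Face transformers text-generation pipeline; the prompt is just an example:

```python
# Local GPT-2 text generation with Hugging Face transformers.
# Assumes `pip install transformers torch`; pass device=0 to use a GPU.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("The future of language models is", max_new_tokens=40)
print(result[0]["generated_text"])
```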

ArthurParkerhouse t1_je2z9te wrote

Depends on what's needed, I suppose. For a general chatbot it seems to work fine, plus we'll soon get the opportunity to fine-tune 3.5-Turbo models (sketched below), which will be enticing at that lower price point.

3
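For reference, a minimal sketch of what fine-tuning 3.5-Turbo could look like, assuming it lands in the OpenAI Python SDK's fine-tuning-jobs flow; the training file name and its contents are placeholders:

```python
# Hypothetical fine-tuning of gpt-3.5-turbo via the OpenAI Python SDK (v1+).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload chat-formatted training examples (one {"messages": [...]} object per line).
training_file = client.files.create(
    file=open("chat_examples.jsonl", "rb"),  # placeholder file
    purpose="fine-tune",
)

# Start the fine-tuning job; poll its status later via the job ID.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)
```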

PolPotLover t1_je265vf wrote

Really, sir? Have you tried it? How do you compare GPT-3.5 with GPT-4? I've been using it for a while now and I'm seeing great results.

2

apinanaivot t1_je24cio wrote

What makes you think it isn't?

1

iJeff t1_je2n91k wrote

It definitely isn't. Not only is the icon green like GPT-3.5's, it also replies exactly like GPT-3.5 to certain narrative-based commands.

2

apinanaivot t1_je4d58k wrote

Weird. GPT-4's biggest strength was said to be its ability to use external tools proficiently, unlike GPT-3.5.

2

Beowuwlf t1_je4p2yu wrote

3.5 had been using tools (via LangChain) for months before plugins were announced; see the sketch below.

1
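For context, this is roughly what "using tools" with LangChain looked like: a ReAct-style agent wrapping gpt-3.5-turbo. A minimal sketch; the tool choices and question are illustrative, and the serpapi tool needs a SERPAPI_API_KEY in the environment:

```python
# gpt-3.5-turbo calling external tools through a LangChain ReAct-style agent.
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
tools = load_tools(["serpapi", "llm-math"], llm=llm)  # web search + calculator

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,  # prints the thought/action/observation loop
)
agent.run("What is the current population of Finland divided by two?")
```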

iJeff t1_je66e93 wrote

I suspect they just don't have enough capacity yet. As it is, I find the responses a bit slow whenever it needs to access the internet. Combine that with GPT-4's slow response times, largely due to congestion, and its 25-messages-per-3-hours limit, and it wouldn't be a great experience.

1