
judasblue t1_j1akvas wrote

Oh, I was just pointing out that 1,000 tokens in their base model for other services costs $0.0004, an order of magnitude lower than u/coolbreeze770 was guessing. In other words, pretty friggin cheap for most uses, since a rough way to think about it is three tokens equaling two words on average.

edited for clunky wording
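If it helps, here's a quick back-of-the-envelope sketch in Python, assuming the $0.0004 per 1,000 tokens figure and the three-tokens-per-two-words heuristic above (my own rough math, not official pricing):

```python
# Rough cost estimate, assuming $0.0004 per 1,000 tokens (base model pricing
# quoted above) and ~3 tokens per 2 words on average.

PRICE_PER_1K_TOKENS = 0.0004   # USD
TOKENS_PER_WORD = 3 / 2        # ~1.5 tokens per word

def estimate_cost(num_words: int) -> float:
    """Approximate API cost in USD for a given word count."""
    tokens = num_words * TOKENS_PER_WORD
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# e.g. a 500-word response works out to roughly $0.0003
print(f"${estimate_cost(500):.6f}")
```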


f10101 t1_j1cmsob wrote

Just in case you missed my other comment: ChatGPT seems to actually be particularly expensive to run compared to their other APIs. Altman says it's "single digit cents per chat".
