UX-Edu t1_j1zajoc wrote

It reads like a synopsis, not a story. Like somebody wrote a book about a book.

I think AI is going to have a much harder time with words than with images. Humans are meaning-making machines. You show us an image and we’ll jam meaning on top of it. But words are trickier, and you don’t get meaning for free out of them.

It’s coherent, but it’s also pretty bad.

Warm-Enthusiasm-9534 t1_j1zdbce wrote

Honestly, ChatGPT is great with words. It's not as good as a professional author. I would say it's not even as good as me now, but it's better than me in high school and maybe even college. And this is without OpenAI really even trying (it's not like they went out of their way to only train on good writing and avoid bad writing). "Words" is an entirely solved problem.

There are some minor problems with it. It has no sense of style or voice, so everything tends to take the same tone. It's bad at jokes. It's also really lazy, in that when you tell it to do something, it does it in the most lackadaisical fashion. Its answers tend to read like a college student doing homework for a class they don't really care about, trying to bullshit their way through. But it could be that all of these issues would go away if you just gave it the right instructions when you use it.

Where the current AIs break down is that they have no mental model of the world, and they don't do any long-term planning. They would never get how an event in chapter 3 foreshadows an event in chapter 23. It's these subtle elements of writing, rather than just words, that they're bad at. At least for now.
