
Adognamedbingo t1_j4rdru4 wrote

This is the issue I feel like people ignore when it comes to LLMs.

Yeah, it’s without a doubt impressive how it generates sentences and content, but if you’re not very familiar with the subject of a question, you can’t tell whether the answer is “correct”.

So if something like ChatGPT is going to take the place of a search engine, it would still need to tell you where it got the info from.

And then, how big a difference is that from what we currently have?

Another thing I rarely see discussed: why would anyone create content if they don’t get any visibility or customers from it, and the model just gives an answer without any credit to the actual creator?


Sidivan t1_j4rgjox wrote

That’s a good point about content creation. The way we monetize content today is by consumption: the model measures traffic and assigns value to it. If AI is going to serve up all that content in the form of its own content creation, who gets paid? Does everybody it references or considers get a slice? In the above example about tigers, does every single one of the millions of photos it referenced get paid, or are they simply inspiration? This battle for revenue is happening right now with visual artists and AI art.


ddaaddyyppaannttzz t1_j4rrax7 wrote

I agree 100% with this major issue. I did a ChatAI search on topics I know a lot about (research and expertise), and it brought back some useful and correct information, but also some false info from probably dubious sources. We can’t put too much trust in this tech without knowing the source of the info. At least with a search engine you can skip the known suspect sources, or at least read and compare the sources. Right now people are putting too much trust in ChatAI and similar tools; that should change with time (I hope).
