Comments



Johns-schlong t1_jdf916o wrote

I would argue that a tool people use to find information should give factual information. Imagine buying an encyclopedia and realizing you can't trust it.

1

LymelightTO t1_jddnifw wrote

Seems like the author had a conclusion in mind (“Write an article about how LLMs could be bad because they might misinform people”) and then tried to find the most sensational way to frame that conclusion.

Bing cited the source of the incorrect claim, so you could independently verify it. It doesn't consistently seem to make the claim (presumably just due to how the technology works), and the claim isn't even something I can't imagine a human also thinking, if they had just Googled the topic and skimmed it for keywords without much prior understanding.

This just seems like an updated variation of the "Wikipedia isn't a good source" claim from the early 2000s. Like, it's still largely a true claim: Wikipedia has lots of wrong information in it that seems very factual. But it's also a very good tool for reference, if you use common sense, have some prior understanding of the subjects involved, and have some good heuristics about which pages are likely to be more factual and up to date than others.

Seems similar with LLMs. You need some prior intuition about what they're good and bad at to get value out of them. Idiots are always going to find a way to hurt themselves with tools.

12

MystikGohan t1_jddw7xz wrote

I agree. Besides, I don't see Bard as the shining example of the future of LLMs. From what I've heard, it's already been heavily critiqued.

3

Old-Owl-139 t1_jde2oyo wrote

Just another attention-seeking article from The Verge. Shallow and predictable.

3