JoshN1986 OP t1_iwqs1g5 wrote
Reply to comment by [deleted] in Ask research questions in plain language and get answers directly from the full text of research articles by JoshN1986
The excerpts are taken directly from research articles; they are not generated by AI. How does this "hand the reins of understanding to a computer"? How is surfacing excerpts and abstracts based on a search or a question worse than just returning titles?
Moreover, each answer comes with citations so you can see how and why the returned article has been cited. We aim to increase critical thinking by surfacing conversations among papers, not just a list of citations we all treat equally.
[deleted] t1_iwqsif7 wrote
[deleted]
JoshN1986 OP t1_iwqt9da wrote
So it is safer to hide the information and only return titles? Do you view full-text search as harmful? Do you advocate for paywalls to keep the public out? Do you have the same issues with Google Scholar, which shows excerpts? These excerpts are not standalone; they are linked directly to the full-text articles.
It seems as if you are portraying this as something it is not. It's a discovery engine where you can search using boolean operators or ask questions. The results are peer-reviewed article titles, excerpts, and abstracts. I fundamentally disagree that that is somehow harmful to understanding or critical thinking. It makes research more accessible.
jaam01 t1_iwrbp34 wrote
>When you hand the reins of your understanding to a computer, you forfeit your ability to make sense of the world and surrender it to whoever programmed the tool you’re using.
This concern is overblown. That's exactly what we are doing by using search engines in the first place. Unless they're open source, we don't know how the algorithms work; we don't know what they prioritize or censor. Google even offered to build a censored search engine for China, Project Dragonfly, and we don't know whether they use similar technology in their main search engine. That logic also applies to the people behind the results the engine provides. We don't know their motives, biases, or conflicts of interest, unless you do a background check on every author of everything you read. Trust in scientists is at an all-time low because it's now easier to find all the BS scientists have said, which puts their credibility into question. I lost all respect for Neil deGrasse Tyson after all the disingenuous stuff he publishes on his Twitter, because he thinks he's an expert in everything.