Submitted by dustofoblivion123 t3_1194caa in Futurology
Aspirin_Dispenser t1_j9o4w9j wrote
Reply to comment by mcphilclan in Google case at Supreme Court risks upending the internet as we know it by dustofoblivion123
The same way they functioned before the content served to users on their sites was algorithmically amplified or suppressed. It wasn’t that long ago that Twitter and Facebook were using chronological feeds where content was served purely in the order it was posted. You kept pornography out of your feed simply by not following users that posted it. Back then, the argument that these sites were simply repositories of information posted by users was legitimate because the companies that ran them were doing little more than just serving it to users as it came in with ads interspersed throughout to generate revenue. Now, with social media sites choosing to amplify or suppress content, that argument doesn’t hold water. The content is now curated and editorialized in much the same way that a newspaper or book publisher curates their content. If they want to do that, that’s fine, but they need to be held to same legal standards as any other publisher. Or, they need to stop acting like a publisher and start acting like the simple “repositories of information” they claim to be. They can’t have all the financial benefits of providing curated content along with all the lax legal standards of providing un-curated content.
If being held to those standards means their business model no longer works, then oh well. But the truth of the matter is that it does work; it just generates less money.