smokebomb_exe t1_jc7f38x wrote
Reply to comment by Iggitron90 in What can a ChatGPT developed by a well-funded intelligence agency such as the NSA be used for? Should we be concerned? by yoaviram
Passively listening, of course. Otherwise, government agencies have very little need to sow division for whatever nefarious plots they may have, since Americans are dividing themselves. Mention "drag queen" to a Republican or "AR-15" to a liberal and watch cities burn and capitols fall.
MEMENARDO_DANK_VINCI t1_jc7pvzn wrote
Well, in America you're right, but they're probably doing similar things in Russia and China.
CocoDaPuf t1_jcafic7 wrote
>Americans are dividing themselves
That's where you're wrong: Americans were not dividing themselves this much until other nations started directly influencing the public conversation.
Edit: I also don't want to imply that I think American agencies aren't conducting their own AI-driven disinformation and "public sentiment shaping" campaigns. That is certainly happening. If anything, the US has a larger incentive to use AI for that, since here it would be much harder to keep under wraps the kind of programs China and Russia use: the "troll farms," which are essentially huge call centers for spreading misinformation, anger, and doubt.
sinsaint t1_jcahoj3 wrote
And until Republicans started scraping for votes by turning the uneducated into a cult with meme-worthy propaganda.
Drag queens, hating responsibilities, and prejudice against anything a Republican thinks is "woke." It'd be comical if it weren't so effective.
And before that, it was Trump telling everyone a bunch of lies they wanted to hear, all while using the presidency to advertise his buddy's canned beans in the Oval Office.
Other countries didn't make us crazy; the crazies just didn't know who to vote for before.
CocoDaPuf t1_jcawtyo wrote
Absolutely, right on all counts.
smokebomb_exe t1_jcaithr wrote
Correct, the divisive political memes from Russia and China that I mentioned in another reply here.