Submitted by pipe2057 t3_10p4pkn in singularity
TheDavidMichaels t1_j6mr4oa wrote
Reply to comment by TheDavidMichaels in If we achieve AGI in the next ten years, and if we achieve the singularity in the next ten years, will there be an option to entering a hive mind with people who we only know? Also when we achieve AGI and singularity, will there be options to control or modify our mental health (anxiety, depression)? by pipe2057
People need to stop thinking that AGI will make them better. Ask yourself: when a child, or someone with vastly less knowledge than you, tells you how great coloring with markers is, how much effort or brainpower do you dedicate to that clearly trivial task? Zero, or close to it, right? If AGI is vastly more intelligent than us, is it going to help you? No. It will control you or dispose of you. I have never heard a positive outcome articulated by anyone working on these ideas, and never anything remotely good for humanity. Just like this post, it's all wishful thinking about how something can do all the work and make me perfect while I never actually do anything difficult. That's the appeal of the AGI fan club: I get to be lazy forever. Yeah!