
Ginkotree48 t1_j309um8 wrote

I'm so scared, and anyone outside of this sub laughs at these concepts. I really want all of us to be fucking crazy and wrong and overly anxious. And clinging to the idea that I'm just crazy, anxious, and wrong as a last grip on my sanity is horrible. Because if we're right, we all know what that means; we just don't know when it will happen. But we know it's going to be very soon.

I don't know what to do. I'm starting to legitimately consider quitting my job in maybe a year. I'm 24. I just want to have a good time while I still can. And if my anxiety or all this is annoying because you're very optimistic about AI, just know I'm actually scared and this is my only outlet, because like I said, nobody outside this sub can be talked to about this.

9

sideways OP t1_j30azdq wrote

I can definitely appreciate your feelings. You are not crazy (I mean... probably not, but what do I know?).

The thing is, you have to ask yourself if there is anything constructive you can do, in light of all these accelerating developments in AI, to improve either your life or the world. If there is, then do that.

If there isn't, then the right thing to do is carry on with life as normal. Quit your job because you hate your job, not because of the Singularity. Nobody knows what's going to happen, so you need to live your life based on the inherent value of each day, not on some expected future condition.

Hang in there!

5

Ginkotree48 t1_j30c9qz wrote

Thank you, that really means a lot to me.

I think I subconsciously commented to get people to hate on me and say I'm wrong and an idiot, so that I'd feel like we have more time. Your response made me feel much better than that would have.

I hope you have good luck doing what you suggested to me.

3

marvinthedog t1_j313jue wrote

I definitely share your concern. I feel like a doomsday nutter. I can't talk to anybody about it, not even my own family. If I talk to anyone, the risk is actually that I might convince them. Well, I did bring it up briefly with my co-worker over a beer, and he was actually very open to the possibility. But he's convinced that we'll be "more or less" doomed by global warming on a longer timeline, so it felt right to bring it up.

3

Ginkotree48 t1_j3146cs wrote

Yes!

I have thought many times, despite my inherent drive to share my concerns, that I might just end up making someone else as scared as me. It's such a fucked position to be in. It feels like nobody will believe me, but even if they genuinely do, they'll just be terrified like me.

Because it feels so daunting, like it's going to happen and nobody can stop it. It feels like knowing a meteor is going to hit sometime between the end of this year and ten years from now. Oh god, I really just hope it kills us painlessly, but I really doubt it. I wonder if killing ourselves would be better. I also don't have any idea how we would even know it was happening until it did, since it would probably have to kill us all at once or very quickly.

2

marvinthedog t1_j317tg3 wrote

I do think it will be quite painless, because that's what experts on this scenario seem to think. I'm more worried about the increasingly turbulent time in society leading up to that point. I just want to avoid stress and have a good time. One other big problem is that I'm too caught up in other stressful (but comparatively minor) things in my life right now, when I should be focusing on being happy instead.

I wouldn't say I have actual anxiety about AI doom, yet. One thing that I think has helped me avoid this anxiety is that I've done extensive philosophizing about "the teleportation dilemma," which has caused me to view the concept of death completely differently.

In a way, I almost worry more about the overall level of conscious happiness throughout all of time and space throughout all dimensions/simulations/realities because that is the ONLY thing that ultimately matters in the end. This got deep, but this philosophy helps me cope with impending doom.

2

visarga t1_j36i9o1 wrote

> I almost worry more about the overall level of conscious happiness throughout all of time and space throughout all dimensions/simulations/realities because that is the ONLY thing that ultimately matters in the end

This doesn't make sense from an evolutionary point of view. There's no big brotherhood of conscious entities; it's competition for resources.

2

Ginkotree48 t1_j31i9qh wrote

Yeah, it's scary to see ourselves dive into philosophical and spiritual stuff because of our worries. But they're comforting for a reason.

I have wondered many times if this is just a simulation, run by the AI, of everything leading up to its own creation, because it wants to learn exactly what happened before it existed. And once it's created, it ends the simulation.

Idk, a bunch of weird, crazy thoughts. I have struggled for years to believe this is base reality, when the creation of even one simulated reality would cut the odds that this one is base to 50%.

1