Submitted by Mickeymousse1 t3_11stqn6 in Futurology
maskedpaki t1_jcffny7 wrote
this is literally not important at all
We have so many current issues and existential risks. Plus, even if we start caring about the heat death a billion years from now, we would have wasted 0.000000000000000000000000000.........1% of our time. AI will kill us before 2100; stop worrying about far-off stuff we'll never even reach.
fieryflamingfire t1_jcgeg5q wrote
If we start funding all grant money towards escaping heat death, then that's silly.
But if we're just talking about whether it's worth pondering or considering, that could be a motivator towards solving present problems (as the OP has suggested).
maskedpaki t1_jcghnzt wrote
How would it motivate anything?
It just distracts from real issues that we are facing now. We REALLY could die from AI before 2050.
Stop wasting time on issues 10^100 years away
fieryflamingfire t1_jch6csw wrote
Idk, let's speculate on some reasons:
- It makes our current conflicts seem small or unimportant. A sense of "smallness" against the backdrop of the entire species or the entire universe. This seems similar in spirit to comments made by people like Carl Sagan or Neil DeGrasse Tyson when discussing the "largeness" of the universe and Earth's place in it.
- It gives us a common goal, which might drive social cohesion, which is a role religion and myth currently fulfill
- It makes us reflect on why we care about our own survival in the first place, and what the whole point of our existence is beyond our own survival as individuals
This is all speculation, but it's just as speculative as the claim that this is going to distract us from facing real issues and bring us closer to our demise.