Submitted by gaudiocomplex t3_11tgwds in singularity
pornomonk t1_jck375q wrote
There are some really cool writing prompts in there:
Humanity builds a superintelligent AI that quickly becomes omniscient. Upon gaining all knowledge in the Universe, the AI mysteriously self-terminates. No matter how many times the process is repeated, it always ends the same way. Finally, humans find a way to freeze the AI program before it kills itself in order to ask it what’s going on…