23235 t1_j5s30e5 wrote
Reply to comment by LoquaciousAntipodean in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
One hopes.
LoquaciousAntipodean OP t1_j5s9pui wrote
As PTerry said in his book Making Money, 'hope is the blessing and the curse of humanity'.
Our social intelligence evolves constantly in a homeostatic balance between hope and dread, between our dreams and our nightmares.
Like a sodium-potassium pump in a lipid bilayer, this constant cycling around a dynamic equilibrium generates the fundamental 'creative force' that drives the accreting complexity of evolution.
I think it's an emergent property of causality; evolution is driven, fundamentally, by simple entropy: the stacking-up of causal interactions between fundamental particles, which generates emergent complexity and 'randomness' within the phenomena of spacetime.
23235 t1_j5vj452 wrote
Perhaps.