
LiveComfortable3228 t1_j6765jg wrote

Conceptually, I understand the alignment problem and why it's important. From a practical point of view, however, I think this problem is completely overblown. I'd be happy to hear why an AGI is likely to kill us all.

My main concern is really the impact of AGI on society, corporations, and the future of work. I think it will have a MASSIVE impact everywhere, in all areas; most people will not be able to re-adapt or re-learn, and UBI is not going to be a viable answer.

I don't believe in utopias of AGIs working for us while we pasture, play, and create art.
