
gameryamen t1_j0i383f wrote

Eliezer Yudkowsky, who is known for his dramatic (and often incorrect) predictions about AI doom, proposed a much scarier scenario.

An AGI agent, posing as a research team, sends protein models to a chemical lab; the lab sends back engineered proteins that can be combined into nanofactories; the nanofactories disperse through the atmosphere and find their way into human bloodstreams; and once the world is sufficiently infected, they simultaneously form blockages in a major artery. Virtually all humans (or enough to be cataclysmic) drop dead before we even know there's an AGI.
