leafhog t1_j4z4ckg wrote
Imagine they get a sub-AGI that knows enough about AI research to tell them how to reach weak AGI.
Then the weak AGI tells them how to reach strong AGI.
They have an interest in keeping that secret while they bootstrap god-tier AGI.