andresni t1_j6jwwq5 wrote

If you look at how much energy and computational resources it took to make ChatGPT, then it's pretty obvious that even if ChatGPT4 (or whichever version) could in principle bootstrap itself into the intelligence stratosphere, it wouldn't have the resources to do so. Nor do we have those kinds of resources hanging around unused for the AI to tap into without our explicit knowledge and consent.

And even if we dedicated resources to it, the next iteration would demand even more. The time it takes us to build the supercomputers, gather the data, and provide the requisite energy is measured in months if not years. A self-improving AI wouldn't be able to improve faster than we can allocate resources to it.

Unless, of course, it manages to tap into all our phones and computers and gaming consoles and servers and the like. That might give it the juice it needs. The question is, could it even do so? How smart would it have to be to pull that off without our collective consent and collaboration?
