
dieselreboot t1_je372im wrote

I think we're seeing the first signs of things accelerating exponentially through natural-language coding tools descended from GPTs. Case in point: OpenAI's Codex, based on GPT-3, which powers GitHub Copilot and now Copilot X (GPT-4?). GitHub Copilot is an AI 'pair programmer' that helps the coder write code. These tools are available as extensions for the integrated development environments (IDEs) used by developers worldwide.

I'm willing to wager that the developers at OpenAI, and the developers of the Python/C libraries that GPT relies upon such as TensorFlow and NumPy, are already using Codex/Copilot or vanilla ChatGPT (GPT-4). They'll be using these tools to help them write the next generation of GPTs or their dependencies.

As each new version of TensorFlow, NumPy, GPT, Codex or Copilot comes out, it would be interesting to see what percentage of the codebase was written by an AI. Humans are in the loop for now, but their contribution will shrink over time. As the software development and improvement process becomes more automated, the time between releases will contract.
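To make that "what percentage" question concrete, here's a rough sketch of how a project could track it. It assumes a purely hypothetical team convention of adding a "Co-authored-by: GitHub Copilot" trailer to commits where the AI wrote most of the code (Copilot does not add such a trailer automatically), and it only counts commits, not lines, so it's an illustration rather than a real measurement tool:

```python
# Sketch: estimate the share of commits in a local git repo that were
# tagged as AI-assisted. Relies on an assumed (hypothetical) convention
# of adding a "Co-authored-by: GitHub Copilot" trailer to such commits;
# Copilot itself does not write this trailer.
import subprocess

AI_TRAILER = "Co-authored-by: GitHub Copilot"  # assumed convention, not a Copilot feature


def commit_messages(repo_path="."):
    # %B = full commit message body; %x00 puts a NUL byte between commits
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%B%x00"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [m for m in out.split("\x00") if m.strip()]


def ai_assisted_share(repo_path="."):
    messages = commit_messages(repo_path)
    if not messages:
        return 0.0
    tagged = sum(1 for m in messages if AI_TRAILER in m)
    return tagged / len(messages)


if __name__ == "__main__":
    print(f"AI-assisted commits: {ai_assisted_share() * 100:.1f}%")
```

Run it from inside any git checkout; it would print 0% today for most repos, which is kind of the point of tracking it release over release.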

Codex/Copilot is already being used to write software. Soon all coders will be using Copilot or seeking other work. All software will have an ever-increasing percentage composed by an AI, and that includes the next versions of the AIs themselves and the libraries they depend upon. This has the potential to 'take off' very quickly: self-improving AI before AGI/ASI. I think the singularity has already begun, to be honest; at the very least we're falling into it.
