
acutelychronicpanic OP t1_jefa2tg wrote

It is critical that AI development not be concentrated in the hands of only a few big players. Large corporations, military research labs, and authoritarian regimes will not pause their research, only hide it. There is simply too much at stake.

By distributing AI research and development, particularly alignment work, we make it far more likely that AI will serve everyone.

Concentrated development amplifies the risk of AI catastrophe by creating a fragile system in which, once AGI is developed, even a minor misalignment may be unfixable because there are no counterbalancing forces.

Distributed development means that yes, there will be more instances of mistakes and misuse, but these will be more limited in scope and less likely to lead to total human extinction or subjugation by an AGI system that *almost* shares our views.

We may still be some years off from real AGI, which is why this is a critical time to ensure the technology is widely distributed, preventing any single faction or actor from acquiring such a lead that they can set the terms of our future.

The above are my thoughts on the matter and do not represent the views of LAION (which I am not affiliated with), although there is overlap.
