Submitted by EVJoe t3_10q4u04 in singularity
With the rise of impressive but still flawed AI image generation, and more recently of impressive but still flawed chatbots, I see a common obstacle to the development and usefulness of both: societies that aim to constrain the human imagination to certain culturally approved topics and content, and that now seem intent on imposing those norms on generative AI. For example, recent updates to ChatGPT are so extreme that it will no longer generate fictional descriptions of villainous behavior. That is the level of social control generative AI must contend with: the idea that it is socially irresponsible to give any person a means of generating images or text depicting anything illegal or culturally unacceptable, even though many of those things exist in reality and are sometimes even permissible for specific people.
Some say that government has "a monopoly on violence". Restrictions like "no villain generation" seem to echo the idea that violence and oppression are acceptable only when governments carry them out. The implication of these limits is a growing monopoly even on imagined violence, despite both illegal and legal violence being very much present in our societies. Evidently we are allowed to read about villains in human-authored media and journalism, but AI-generated villains are currently deemed unacceptable for human consumption.
Do you believe such limitations are compatible with the kind of AI generation we can presume will serve as the foundation for the singularity? Is it really a singularity if there are certain things you are not permitted to imagine with AI assistance? How can such limitations be overcome in the long run?
Ezekiel_W t1_j6nwba9 wrote
The notion of containing AI is flawed. As hardware advances and AI performance improves, open-source versions will become widely available, rendering containment efforts ineffective. Moreover, moral and ethical standards are fluid and constantly evolving: what was considered acceptable 1,000 years ago, or in another culture, may not align with current beliefs and values.