
Monte924 t1_ja1q78m wrote

Actually, thinking about it, it might not be a moral panic thing. It could be that if people use Midjourney for pornography, that will play a role in its training and could skew results for others. It's like using Google with SafeSearch turned off; even a perfectly innocent search can still return NSFW results. So they exclude the porn prompts while the AI learns, so that it doesn't pick up bad habits in the early stages of its training.

Also, the company might not want its AI to get associated with porn. That's just not good for PR.


keylimedragon t1_ja4p2l5 wrote

Models aren't trained based on what people search, though; training usually happens beforehand on a large fixed dataset (ChatGPT may also use the thumbs up/down feedback for fine-tuning). It's more likely they're trying to avoid liability and controversy.
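To illustrate the distinction: blocking NSFW prompts is typically an inference-time filter sitting in front of a frozen model, separate from training. A minimal sketch of such a filter, with a made-up word list (not any real service's actual list):

```python
# Hypothetical prompt filter, run before a request ever reaches the model.
# The banned-word set below is purely illustrative.
BANNED_WORDS = {"nsfw", "nude", "gore"}

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt contains any banned word."""
    tokens = prompt.lower().split()
    return not any(token in BANNED_WORDS for token in tokens)

print(is_prompt_allowed("a watercolor painting of a fox"))  # True
print(is_prompt_allowed("nude portrait"))                   # False
```

Real services likely use far more sophisticated classifiers, but the key point stands: the filter acts on requests, not on the training set.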
