chocoduck t1_j4vmclx wrote
I am very pro AI but I think this is a really important case. These bots need to be controlled and having your content scraped should be on an opt-in basis. They really need to get their ducks in a row bc the bot does essentially reword your content and use YOUR WORK to generate results. That's not fair. They all need to be tightly regulated
broadenandbuild t1_j4vr2kd wrote
Good thing scraping isn’t illegal
leroy_hoffenfeffer t1_j4vv3of wrote
Scraping URLs is not illegal, no.
Taking those URLs and downloading the images on/in those URLs is a different story though. Collecting URLs is benign. Extracting information from those URLs may not be though.
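In code terms, the difference is roughly this (a minimal Python sketch with a made-up page, not any particular site's actual layout):

```python
import requests
from bs4 import BeautifulSoup

# Step 1: collecting URLs -- just reading the page and noting where images live.
page = requests.get("https://example.com/gallery")  # hypothetical page
soup = BeautifulSoup(page.text, "html.parser")
image_urls = [img["src"] for img in soup.find_all("img") if img.get("src")]

# Step 2: extracting the content itself -- actually downloading each image,
# which is the part that may be treated differently.
for url in image_urls:
    data = requests.get(url).content
    with open(url.split("/")[-1], "wb") as f:
        f.write(data)
```

Step 1 only gathers addresses; step 2 copies the work itself, and that's where the legal question sits.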
broadenandbuild t1_j4w02gx wrote
It’s also not illegal to scrape the content of a site if said content is accessible to the public
leroy_hoffenfeffer t1_j4w0ixj wrote
We'll see how that plays out legally with respect to artwork.
Just because that's how things function currently doesn't mean it's going to end up being legal to do so in the future.
broadenandbuild t1_j4w3eyo wrote
You’re right about that. I’d think the internet ought to be treated like a public road: if things are intentionally made to be viewed in public, then people are at liberty to record them. But, that may not be the case
chocoduck t1_j4wj0jt wrote
People are downvoting me, but this is just the fun, content-generating AI. Realistically, bots will scrape us and craft custom ads to fit your personality and likes.
TerrryBuckhart t1_j4vqlr9 wrote
But how? Aren’t these models open source?
It’s up to the user at the individual level to decide what they scrape or train on. If they break copyright law while doing that, it’s not the tool that’s responsible, it’s again the individual.
gay_manta_ray t1_j4wfhbv wrote
that isn't at all how any of this works. there is no database of images of stolen art that the model draws from when you generate a prompt. you're going to have to point out exactly where this stolen art is in their model, and good luck with that, lol.
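to be concrete, here's roughly all that happens at generation time (a rough sketch using the open-source diffusers library; the checkpoint name is just one public example, not any particular service's model):

```python
from diffusers import StableDiffusionPipeline

# Everything the model "knows" lives in these learned weights -- a few GB of
# floating-point numbers. There is no folder or database of source images here.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# At generation time the only inputs are the text prompt and random noise;
# the pipeline never looks up or retrieves any existing artwork.
image = pipe("a painting of a duck in a raincoat").images[0]
image.save("output.png")
```

the prompt and the weights are the whole story; nothing is being copy-pasted out of a stash of scraped images at inference time.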