AbeWasHereAgain t1_jeat78b wrote

Go ask Vanilla Ice what happens when your music sounds a little too close to the original.

OpenAI and Microsoft are 100% violating the terms of use for the vast majority of the stuff they scraped.

6

khamelean t1_jeautqo wrote

All musicians learn from hearing other music.

There is a difference between learning and copying.

−3

No_Character_8662 t1_jeaxen9 wrote

So if I call something in my process "learning," I'm free to use it? I'm "learning" copies of your works on my printer to sell right now.

Edit: To be clear, I don't know what the answer is, but that seems simplistic.

7

Numai_theOnlyOne t1_jeb9lah wrote

Tbh, can we separate human learning from AI learning?

A human is an imperfect biological being that requires time and repetition to learn.

AI just needs a large pool of data and can do the same as millions of humans in a fraction of the time.

I think that's not the same kind of learning, and it's something that honestly should be questioned. After all, our content was created with humans in mind and was never meant to be used for AI.

5

khamelean t1_jeaztbb wrote

The learning isn’t the problem, the selling is.

−4

Newfondahloose t1_jeb49yx wrote

They are selling their own work. There’s only so many ways you can answer a question. Just because you’ve answered the question before doesn’t mean someone else can’t come to the same conclusion when answering for themselves.

2

AbeWasHereAgain t1_jeaxmj7 wrote

lol - you don't think ChatGPT is spitting out insanely close replicas of other people's work daily?

4

khamelean t1_jeazo62 wrote

Nothing wrong with playing/singing other people’s songs, I sing along to the radio in my car all the time.

1

AbeWasHereAgain t1_jeazzbi wrote

ha ha ha - yeah, totally the same thing. Just an FYI, artists are required to pay when they do a cover.

Everything changes when you start making money off other people's work.

3

khamelean t1_jeb0muo wrote

That’s exactly my point.

1

AbeWasHereAgain t1_jeb0qkx wrote

What is your point?

2

khamelean t1_jeb2qq4 wrote

It’s not a problem until you start making money off other people’s work.

2

Space_Pirate_R t1_jeb6yrh wrote

Are monetized AI artists paying royalties to everyone whose art was scraped off the web?

1

khamelean t1_jebg7rh wrote

Are human artists paying royalties to everyone whose art they scraped off the web??

1

Space_Pirate_R t1_jebi0au wrote

Human artists learning from others' work is obviously "fair use." I don't think a corporation will successfully deploy that in defense of training a commercial AI.

1

khamelean t1_jebj0yq wrote

Just looking at a piece of art is enough to encode it into a human’s neural network. Why should it be any different for an artificial neural network? If it’s free to access then it’s free to access.

1

Space_Pirate_R t1_jebovpk wrote

I don't believe that an artificial neural network is morally or legally equivalent to a human. If I did believe that, then there would be more pressing issues than copyright infringement to deal with, such as corporate enslavement of AIs.

0

khamelean t1_jebrxm4 wrote

What does moral or legal equivalence to humans have to do with anything?

The point is that all AI has to do to learn from art is look at it. If someone makes their art free to look at, then it’s free for an AI to look at.

0

Space_Pirate_R t1_jebta1s wrote

AIs don't have agency. The AI is a tool which is being operated by a corporate entity. The corporate entity is governed by existing laws, and requires a license to use a copyright work in the operation of their business.

0

khamelean t1_jebxbzr wrote

So companies have to pay a licensing fee to every artist whose work employees of that company have ever looked at?? Yeah, I don’t think that’s how it works.

0

Space_Pirate_R t1_jebzo2e wrote

No, because (as I mentioned earlier) there is a fair use exemption which allows humans to be educated using copyright works. However, there is no such exemption allowing corporations to train AI using copyright works.

0

khamelean t1_jec1fmg wrote

Education is irrelevant in this context. The copyrighted works people consume through education are a tiny fraction of the total number of copyrighted works that most people experience throughout their lives. And all of those experiences contribute to that person’s capabilities.

The exemption for educational purposes is for presenting copyrighted material to students in an educational setting. It has nothing to do with copyrighted works that the student might seek out themselves.

0

Space_Pirate_R t1_jec5lal wrote

Yes, humans experience copyright works and learn from them, and that's fair use. What does that have to do with training an AI?

A person or corporation training an AI is covered by normal copyright law, which requires a license to use the work.

1

khamelean t1_jec837g wrote

How is it any different to an employee “using” the work? Corporations don’t pay licensing when an employee gets inspired by a movie they saw last night.

Why do you keep mentioning corporations? An AI could just as easily be trained by an individual. I’ve written and trained a few myself.

1

Space_Pirate_R t1_jec8j1p wrote

>Corporations don’t pay licensing when an employee gets inspired by a movie they saw last night.

The employee themselves paid to view the movie. The copyright owner set the amount of compensation knowing that the employee could retain and use the knowledge gained. No more compensation is due. This is nothing like a person or corporate entity using unlicensed copyright works to train an AI.

>Why do you keep mentioning corporations? An AI could just as easily be trained by an individual. I’ve written and trained a few myself.

Me too. I keep saying "person or corporation training an AI" to remind us that the law (and any moral judgement) applies to the person or corporate entity conducting the training, not to the AI per se, because the AI is merely a tool and is without agency of its own.

1

khamelean t1_jecbi7y wrote

“What does that have to do with a person or corporate entity training an AI?”

Training a human neural network is analogous to training an artificial neural network.

Whether the employee paid to watch a movie doesn’t matter; they could have just as easily watched something distributed for free. The transaction to consume the content is, as you said, irrelevant to the corporation.

An AI consuming a copyright work is no different to a human consuming a copyright work. If that work is provided for free consumption, why would the owner of the AI have to pay for the AI to consume it?

1

Space_Pirate_R t1_jecfcfy wrote

>Training a human neural network is analogous to training an artificial neural network.

By definition, something analogous is similar but not the same. Lots of things are analogous to others, but that doesn't even remotely imply that they should be governed by the same laws and morality.

>An AI consuming a copyright work is no different to a human consuming a copyright work.

A human consuming food is no different to a dog consuming food. Yet we have vastly different laws governing human food compared to dog food. Dogs and AI are not humans, and that is the difference.

>If that work is provided for free consumption, why would the owner of the AI have to pay for the AI to consume it?

If that work is provided for free consumption, why would the owner of a building have to compensate the copyright owner to print a large, high-quality copy and hang it on a public wall in the lobby? The answer is that the person (not the AI) is deriving some benefit (beyond fair use) from their use of the copyrighted work, and therefore the copyright owner should be compensated.

1

khamelean t1_jecru6d wrote

The building owner is using a replication of the copyrighted work. The owner should absolutely compensate the original creator.

But the printing company that the building owner hires to print the poster doesn’t owe the original creator anything, even though it is directly replicating the copyrighted work and certainly benefiting from doing so. If the printer were selling the copyrighted works directly, then that would be a different matter and they would have to compensate the original copyright owner. So clearly context matters.

An AI doesn’t even make a replication of the original work as part of its training process.

If the AI then goes on to create a replication, or a new work that is similar enough to the original that copyright applies, and that work is used in a context where copyright would apply, then absolutely, that would constitute a breach of copyright.

It is the work itself that is copyrighted, not the knowledge/ability to create the work. It’s the knowledge of how to create the work which is encoded in the neural network.

Lots of people benefit from freely distributed content. Simply benefiting from it is not enough to justify requiring a license fee.

Hypothetically speaking, let’s say a few years down the line we have robot servants. I have a robotic caregiver that assists me with mobility, much as I might have a human caregiver today.

If I go to the movies with my robot caregiver, they will take up a seat, so I would expect to pay for a ticket, just as I would for a human caregiver. Do I then need to pay an extra licensing fee for the robot’s AI brain to actually watch the movie?

What if it’s a free screening? Should I still have to pay for the robot brain to “use” the movie?

Is the robot “using” the movie in some unique and distinct way compared to how I would be “using” the movie?

1

Newfondahloose t1_jeb3tmh wrote

It’s learning and using language to answer questions. There’s only so many ways you can answer the same question. Greed getting in the way of progress, as always. I guess professors should give a citation every time they give a verbal answer, even though they are answering from memory.

0

Particular-Way-8669 t1_jebcdng wrote

There is a difference between a human who can be creative and a computer program that creates aggregations. They are completely different things. AI does not really learn; it adjusts its mathematical functions based on data.
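
A minimal sketch of what "adjusting mathematical functions based on data" looks like in practice, assuming nothing more than a toy one-parameter model trained with gradient descent. The data, parameter names, and learning rate here are purely illustrative; this is not how GPT or any real system is trained.

```python
# Toy illustration: "learning" here means nudging a single parameter w
# so that the function f(x) = w * x fits the training data.
# Real models apply the same kind of update over billions of parameters.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # pairs sampled from y = 2x

w = 0.0             # the one adjustable parameter of the "model"
learning_rate = 0.05

for step in range(200):
    for x, y in data:
        prediction = w * x
        error = prediction - y           # how far the function is from the data
        gradient = 2 * error * x         # derivative of the squared error w.r.t. w
        w -= learning_rate * gradient    # adjust the function toward the data

print(f"learned w = {w:.3f}")  # approaches 2.0
```

Running it prints roughly learned w = 2.000; the point of the sketch is only that "training" here means repeatedly nudging a parameter toward the data.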

1

khamelean t1_jebgej7 wrote

No, there is no difference. Creativity is just combination and random mutation. It’s how humans are creative, it’s how machines are creative. It’s the same thing.

1

Particular-Way-8669 t1_jebh4n3 wrote

This is utter bullshit. There was always some human who came up with something first, when there was nothing like it before. The AI technology we know does not have this ability, and never will. It is only data aggregation, nothing else. A human does not need data from other humans to be creative, and the very fact that someone climbed down from the trees and picked up the first fire is proof of that.

1

khamelean t1_jebi03n wrote

Combination + mutation. It allowed evolution through natural selection to give us every life form on earth. Creativity works exactly the same way.

1

johndburger t1_jeby23l wrote

ChatGPT has “learned” some generalizations from the text that it’s processed, but it has also literally memorized (i.e., copied) billions of words from it.

1