ATR2400 t1_j6g2rf0 wrote
Reply to comment by ikediggety in Google’s MusicLM is Astoundingly Good at Making AI-Generated Music, But They’re Not Releasing it Due to Copyright Concerns by Royal-Recognition493
How do you think it works? Honest question. If I ask an AI to make an image of a boat at sunset, what steps do you think it follows to achieve a result?
ikediggety t1_j6gx4ef wrote
I think it pores through tons of images created by people and copies and combines aspects of several.
Without decades of work being done by humans, there's nothing to "train" the system on. It's imitation, not intelligence
ATR2400 t1_j6h5vgy wrote
That’s not how it works at all. Stable Diffusion is trained on over two hundred terabytes of data, yet its download takes up 4 GB on my computer. How? Because it’s not just pulling images from some database and playing mix and match with their pieces.
Although the comparison to human learning isn't a perfect one, it's called "training" for a reason. The imagery the AI views during training is used to teach it to create its own imagery. If the output bears a resemblance to someone else's art style, it isn't because the AI is ripping images from their DeviantArt page; it's because a great deal of what it learned about imagery came from that person.

It's very loosely similar to how, while learning to draw, I might browse other people's works to learn how images of certain things are assembled and use that to build skill and knowledge, but when I make my own art I don't directly use those images in the creation process. If I learn a lot from a specific person, my style may grow similar to theirs. Now, I must stress that humans and machines are very different, but it's closer to that than it is to the AI accessing some database of stolen images.
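To make that concrete, here's a loose sketch (in PyTorch, with a toy denoising model, not Stable Diffusion's actual training code) of what a single "training" step does mechanically: each batch of images nudges the model's weights a little and is then discarded.

```python
# Loose sketch only: a toy denoising model in PyTorch, NOT Stable Diffusion itself.
# The point: training adjusts weights based on each image, then discards the image.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 784))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def training_step(image_batch: torch.Tensor) -> float:
    noisy = image_batch + 0.1 * torch.randn_like(image_batch)  # corrupt the images
    reconstruction = model(noisy)                              # model tries to undo the corruption
    loss = loss_fn(reconstruction, image_batch)
    optimizer.zero_grad()
    loss.backward()    # compute how each weight should shift
    optimizer.step()   # nudge the weights; the images themselves are not kept anywhere
    return loss.item()

# After millions of such steps, only the weights remain: a few GB of numbers,
# not a library of the training images.
```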
And no, there is no compression good enough to squeeze 250 terabytes into 4 GB without making the data supremely useless. And it doesn't connect to the internet; it works offline.
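Here's roughly what generating an image locally looks like, as a minimal sketch assuming the Hugging Face diffusers library and a local copy of the checkpoint (the folder path is just a placeholder). The only thing read at generation time is the few gigabytes of weights on disk:

```python
# Minimal sketch, assuming the Hugging Face diffusers library and a local
# Stable Diffusion checkpoint already on disk (folder name is a placeholder).
# Generation only reads the weights; no internet, no image database.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "./stable-diffusion-v1-5",   # hypothetical local folder holding the ~4 GB of weights
    torch_dtype=torch.float16,
    local_files_only=True,       # fail rather than download anything
).to("cuda")

image = pipe("a boat at sunset").images[0]
image.save("boat_at_sunset.png")
```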
ikediggety t1_j6jclch wrote
"Because it’s not just pulling images from some database and playing mix and match with their pieces."
It's playing mix and match not just with their pieces, but with their characteristics. If that weren't what it was doing, the database would not be required.
"It’s very loosely similar to how if during the process of learning to draw I browsed other peoples works to learn how images of certain things are assembled and used that to gain skill and knowledge but when I make my own art I don’t directly use those images in the creation process."
But you aren't a machine carrying out instructions with no choice. You aren't just an output-producing biological algorithm. The most important input for creation is the initial idea that the work should exist at all. When you sit down to make a painting, yes, you are employing techniques you may have learned from others, but you may also invent your own techniques that haven't been used before. Most importantly, you are the instigator of your own creative process: you are not making that painting because you are compelled to by outside forces you cannot control; you are making it because it occurred to you and you thought it was a good idea.
No machine, absent human input, has ever produced a painting, for the simple reason that no machine ever does anything absent human input. Machines simply carry out instructions given to them by humans. They are very good at that.
It's simply a calculation engine, and humans have done the hard work of figuring out how to use calculations to synthesize works of art.
Let me know when an AI, unprompted and with no input, asks a question of a human being. At that point I will call it intelligence. Until then, it's just a very advanced program processing input to produce output.
SnapcasterWizard t1_j6j0d4r wrote
>Without decades of work being done by humans, there's nothing to "train" the system on. It's imitation, not intelligence
If you raised a human in a dark room their whole life, do you think they could make art if you handed them a paintbrush and turned on the light?
ikediggety t1_j6j9tiw wrote
Well, somebody did. Somebody, somewhere, made a cave painting when nobody else had before.
Imitation is not creation. Advanced copying and pasting is not intelligence; it just looks that way if you squint real hard.
SnapcasterWizard t1_j6jfqh1 wrote
Yes, and cave paintings don't look anything like the kind of art produced today. Art is learned and developed through imitation.
ikediggety t1_j6jjjxg wrote
It can be, but that's not the only avenue.
Crucially, major developments in art are frequently reactions against what came before, not simply reiterations of it. Pointillism was unthinkable in the 1700s, for example, because nobody had thought to do it. The idea to do it didn't come from a desire to perfect the techniques of Mannerist or Baroque painting; it came from the urge to do something different.
Many major advances in human civilization come from a similar place of not accepting the rules. Machines, on the other hand, are literally incapable of not following the rules. They are large calculators. Rules are all they do.
Left unattended, a human will assess their environment and choose to take actions that will benefit them.
Left unattended, computers will rust, because it will never occur to them to do anything else, because nothing ever occurs to a computer. Computers don't have ideas.
SnapcasterWizard t1_j6jy87m wrote
>Left unattended, computers will rust, because it will never occur to them to do anything else, because nothing ever occurs to a computer. Computers don't have ideas.
Except that if the computer is running a neural net, then yes, it actually can "come up with new ideas." That's the entire point of machine learning algorithms.
As for your previous paragraphs: in order to have a reaction against something, there has to be a something there first. New art styles and ideas build upon everything that came before them, even if they're a rejection of those ideas.
ikediggety t1_j6kbbtt wrote
But machines don't do that. AI will never invent a new genre of music, or a new style of painting. It can iterate and improve upon what it already knows. That's it.
All it's doing is running really fancy math that humans invented and programmed into it, to analyze thousands of works made by humans and spit out variations on them. It's a Netflix recommendation engine on steroids.
And no, computers don't "have ideas" because ideas are spontaneous. Computers produce output, and they do so because that's what human beings instruct them to do.
ETA: Show me the AI algorithm that, when trained on centuries of Baroque and Mannerist paintings, invents Impressionism. Show me the algorithm that, when trained on centuries of Bach and Haydn, invents jazz.