
anothercynic2112 t1_j28mq5g wrote

I don't really understand this question other than fitting a lot of reddit buzzwords into one post. Hollywood is capitalism through and through. It's simply marketing "art" for consumption. There's no value system outside of earnings; there are some self-serving efforts at recognition, but those also exist only to use the talent to make more money for other parties.

The fact that some artists are part of this machine and some movies are objectively amazing works of art is just an accidental side effect of the Hollywood system. Occasionally this system will help highlight and address some social ills, but in almost every case it's an ill that Hollywood has helped perpetuate.

Civil rights? Decades after Malcolm X and MLK, only a tiny percentage of leading characters and movies aren't dominated by white actors. Women continue to be paid substantially less, and while Me Too took Harvey down, the hundreds of enablers and minions who helped him are going about their business.

Does anyone think the casting couch is closed? Do we have anything resembling actual representation of people beyond the traditional pretty folks? Are actors still told to lose 5 pounds or else?

Hollywood is 90% a self-serving beast meant to make money for investors, inflate egos, and get some producers laid. There's no ethics or value system, and without capitalism it doesn't exist. So yeah, they'll always paint themselves with an awesome brush.
