Monday, December 11, 2017

Harvey Weinstein and the Entertainment Industry

Back in October the American movie industry was rocked by a slew of allegations of sexual harassment and assault made by multiple actresses against film mogul Harvey Weinstein. Within just a few weeks his professional career was left in utter ruins, and the turmoil surrounding him was only the beginning of the storm. Since Weinstein’s fall, dozens of other men from various parts of the entertainment industry, the press, and the political world have been fired, have resigned, or have come under intense scrutiny amid the wave of sexual harassment and assault accusations that has swept the country. The purging flame is still burning, and I suspect we’ve got a while longer before it dies down.

As I’ve watched all this drama unfold, a question that’s been running through my mind is whether anyone is honestly surprised by it. In particular, I wonder if anyone is shocked that most of the allegations seem to be coming out of the entertainment industry. It’s no secret that the entertainment industry has been morally bankrupt for a long time, and I think a lot of us suspected that stuff like this was rampant behind the scenes; now we’re starting to get the proof. Part of me is hopeful that the tsunami of sexual harassment and assault allegations will wash away the filth and make the entertainment industry a more upright and respectable place, but another part of me is extremely pessimistic and thinks that the entertainment industry is fundamentally rotten at its core and will never really change. Indeed, part of me suspects we haven’t even seen the worst allegations come out yet.
