The allegations that have surfaced over the last week or two regarding Hollywood executive Harvey Weinstein are, to say the least, deplorable. I visited Hollywood a couple of years ago and can attest to the overly friendly nature of the people who live there. It's the place where everyone goes in the hope of making their dreams come true, so every conversation, whether it's in a dive bar or a lock-in at the comedy club, carries an air of "you never know": one conversation could literally change your life. So in a way I understand why these allegations have only come to light recently. No one wanted to ruin their relationships or potentially derail the trajectory of their career by accusing one of the most powerful figures in Hollywood of sexual misconduct, because, as the first public accuser proved, chances are no one will believe you.

That said, in the aftermath of the original allegations, bona fide Hollywood stars such as Gwyneth Paltrow and Angelina Jolie have also come forward to say they experienced harassment at the hands of Harvey Weinstein, and judging by the slew of people speaking out, you get the impression it was well known in the industry that this man liked to prey on women. Then on the flip side you have other celebrities saying how dumbfounded they are that he could do such a thing. Either these celebrities were not privy to the industry gossip or they're simply refusing to kick a man while he's down; either way, I don't buy it. The culture needs to change, and everyone is to blame.
It remains to be seen what kind of punishment, if any, he'll receive. But that aside, Hollywood collectively needs to remove its blinders, take a long hard look in the mirror, and ask why it allowed this to go on for so long.