Hollywood after Weinstein

By Divya G
In 2020, every segment of Hollywood has made a push for diversity: white people no longer automatically fill job roles, and HR departments at production houses are more vigilant and responsive when complaints of misbehavior and misconduct are filed.
It’s been almost three years since the Hollywood mogul was put on trial over allegations of sexual misconduct, and certain things have definitely changed in the entertainment industry. Nevertheless, Hollywood has done things its own way since the industry’s earliest days, so not every aspect of the film business has been quick to change.
Even after Weinstein’s trial, Hollywood remains a man’s world. Take the Oscars, for example. It has been about a decade since the Academy of Motion Picture Arts & Sciences last recognized a woman in the Best Director category.
The picture is quite clear. Only one of the twenty acting nominations went to a person of color, and the top films honored at the Oscars featured all-white casts directed by celebrated white filmmakers.
A lot has yet to change in Hollywood, but some improvements can be seen lately. Films centered on female leads are tipped to prosper, with Gal Gadot’s Wonder Woman making a big difference for actresses. Better still, the movie was directed by a woman.
The only question now is how long this momentum will last. Many independent artists and directors are still looking to make their way into the entertainment industry.