Hollywood is Sexist! – Infographic

Hollywood might look like a welcoming place for actresses, but the statistics tell a different story. Women in the industry are routinely discriminated against and treated as eye candy rather than talent, a reflection of the male-dominated culture behind the camera. The sexism shows up everywhere, from salaries to lead roles. The infographic below lays out the numbers. Have a look:

[Infographic: Hollywood is Sexist!]

Infographic by Frame Your TV
