So someone left a two-week-old Tribune lying in our resource room, and while I was having a bagel this morning I leafed through it and found an op-ed article about women in film and TV. The article posed a question I keep coming back to: why is it that women in TV seem so much more empowered and independent than women in movies?
It's certainly not true in all cases and it's not like women on television are fully realized, but I look at the current movies and tv series I'm familiar with and I see the disparity.
So why? Why is it now becoming okay for women on tv to be genuine professionals, or funny without being insecure, or to have problems and personalities that don't revolve around men-- while women in movies still seem stuck either in the traditional rom-com snare or as flat, uninteresting, sexy ass-kickers?
Or if that dichotomy is false, why is it that I seem to notice the ones on tv so much more?