The primary job for women in Hollywood is still 'super-attractive actress'; that remains the most high-profile role available to them.
Hollywood is a perpetual summerland, a temperate, godless yaw where the very word 'season' has been co-opted by television executives. There are few harbingers of winter here.
I think it's pretty obvious that women's stories are not necessarily being told in Hollywood, and women are not necessarily being put in the leadership positions they deserve in mainstream film.
I appreciate the positivity of those 'year of the woman' articles; it's good to get that energy out there. But at the same time, it's not actually happening in Hollywood yet.