I just don’t think women have had this many leading roles in a long time, or been such a key part of the marketing. Women have tended to be used only as love interests, in service of the male lead(s).
Bridesmaids seemed to make Hollywood realise that a good female-centric movie, even without A-list stars, could turn a healthy profit, and since then it feels like a lot more women are being taken ‘seriously’ by Hollywood. And, as a man, I find I prefer a lot of their comedic perspectives and takes on the modern world.