I keep seeing complaints that women are acting like men these days, that women want to be men, that women don't act feminine anymore. Let's clear this up.
Most women do not, actually, want to be men.
What they want is to decide their own destiny, follow their own dreams, control their own bodies, and earn respect and fair wages for their work.
It's just that, up till now, all of that was exclusively available to men, so you can sort of see how people might get confused.