According to many liberal ideologues, men, especially white men, are the evil that plagues our entire planet. From this follows a worldview that masculinity itself is bad, and there is a move afoot to make men and women the same.
To me, this is very confusing. It’s never clear whether liberals want men and women to be the same or want women to be more like men. On TV and in movies, we constantly see women fighting men who are bigger and stronger than they are, and winning. Women are often the tough guys. We feed girls and women a constant stream of entertainment that shows women fighting men.
Does this mean we are trying to get rid of masculinity so that women can exercise it?