I think sexism is good and necessary. By sexism I don't mean failing to take women seriously or denying them rights. I mean the kind of sexism that makes people ask a man for help carrying something and a woman for help watching the kids. Or women dancing sexily as the backdrop for a sports festival instead of men. Things that don't hurt anyone, but that people nowadays get butthurt about anyway.

Why do I think that? Because I think there's always a reason stereotypes exist. Men are physically stronger. Women have biological traits that men don't. Stereotypes are more often true than untrue, otherwise they wouldn't exist (not in every case, but in most). If an individual doesn't fit the stereotype, people should be able to notice and acknowledge that, and yes, society should learn to do that - but not get rid of stereotypes altogether. Just my opinion.