
The claim that feminism benefits both genders doesn't seem to have much basis in truth, no matter how many times it gets blindly repeated. In theory? Maybe, though reading some early feminist writings makes me wonder even about that. In practice? Not a chance. Feminist groups and organizations don't give a second thought to helping men or boys, or to equality in general. The feminist lens many of them operate through means they view the entire world in a false dichotomy: everything either "elevates women" or is "misogyny." Actual equality has taken a backseat.

Beyond that, some feminists have tainted the movement's name themselves. Dishonest studies and statistical reports, hypocritical behavior, and a deliberate blindness to anything that negatively affects men have turned it into a slog of selfish, self-centered, and destructive intentions.

Personally, as someone who believes in equality, I could not in good conscience attach myself to "feminism." It simply is not what it pretends to be.