Non-American here.
I see the US as a highly sexist country, especially when it comes to politics.
That said, I think Americans are generally more conservative on women's issues across both sides of the US political spectrum.
Ironically, I think there's also a lot more fear of women among men in the US.
I'd describe myself as having a little dose of toxic masculinity.