NightlyPoe said:
What's led you to the conclusion that the United States has become more accepting of nudity? It's been my observation that the culture has actually gotten significantly less accepting of showing skin over the last decade, particularly about objectifying women's bodies.
I agree that objectification of women is not as acceptable today as it was in times past. But that's not the same thing as sex. Stuff is regularly shown on network TV today that would have been controversial on premium cable channels a couple of decades ago.
In other words, I think you're conflating the issues.