Honestly, it's still the ideal compared to the alternatives, and that's not changing any time soon. Sure, plenty of media outlets and various celebrities would have you believe otherwise, but which political party owns most of the media in the States?
As a disclaimer, I'm neither white nor Christian, but I don't believe everything on TV either.