
Tough question.

From an outsider's point of view, it often seems like a crime-ridden place with arrogant people. More often than not, people in high places hold the country up as some sort of shining example of how everything should be, while being massively ignorant of the customs and traditions of other nations and cultures. Every once in a while, someone will say something laughable, hypocritical, or even patronizing about another country that could surely compete for a 'Dumbest Quote of the Millennium' award. Like when some woman from the UN commented on how we celebrate our holidays here in the Netherlands. I'll tell you, a lot of people in high places aren't really promoting the image of your country much; you're getting laughed at instead. Or worse. Then there are things like the school shootings and the massive murder rate, which make people in other Western countries scratch their heads about why the US still has such loose gun policy. Racism, which occurs in every 'camp', also seems to still be a big thing in the US.

I'm sure we only hear about the bad parts here, though. The media is always biased and thrives on sensation; good news and lightheartedness are boring, I guess.

I spent a week in New York some years ago, and I have to say it was one of the best vacations I've ever had. The city is amazing and I felt right at home. Even for such a busy city, people seemed very friendly, helpful, and respectful. It does get annoying that every American seems to think Dutch people speak German (saying things like "Auf Wiedersehen!"), but at least they tried to be polite, which is more than can be said for most. That's one of the things we Dutch could take an example from, as we're often accused as a people of being blunt, an accusation that's usually fair, even if there's no ill intent behind it; we're just that down-to-earth.

So I've seen New York and visited pretty much every neighbourhood, which gave me a good sense of the city. Beyond that, though, I haven't been anywhere else (except Philadelphia Airport), so my view of the country remains mostly an outsider's. A couple of friends and I were planning to tour the Western US next spring, but one of them had problems getting a visitor visa, so that plan unfortunately has to be shelved.

The natural scenery is unique, which is why it's so interesting to foreigners, and there are a lot of beautiful things to see, but the cities, from what I know, are a mixed bag. To a European, used to cities with centuries or sometimes millennia of (visible) history, most US cities seem very uninteresting: plain, 'ugly' (for lack of a better word), and soulless. There are a few exceptions, of course, one of them being NYC, which I visited and would love to revisit sooner rather than later, but here in Europe there are too many amazing cities to count.

Because of that, and the unsafe image along with the arrogance of the people representing the country, I wouldn't want to live there. Especially since I already live in a country with an equal or (depending on what you're looking at) higher quality of life. It remains a good country to visit, though: my parents have been to Florida and returned with plenty of stories, and that tour of the Western US will still happen someday. As will revisiting New York. Can't wait!

Lastly, I do hope you Americans realize that your presidential elections, and the hysteria that goes with them, are just one big popcorn show for the rest of us.