Now, this is quite a 'touchy' subject, and I don't mean this to be offensive at all; I honestly just want to know the answer! So I'm hoping this can stay civilized so that my question can actually be answered.
As an American, do you ever get sad, or annoyed, or anything like that, because so many people dislike Americans and America in general, and because Americans are constantly being ridiculed?
I know that where I live in Canada, I can't go one day (and that's not an exaggeration at all) without hearing some sort of negativity toward the U.S. in general and/or Americans. The same happened when I was in Spain, and the same happens when I'm in England. It seems that the general consensus is that America is hated (but at the same time loved) by the whole world.
Just this past week my friend (a very innocent-looking sixteen-year-old girl) went on a school trip to New York. Simply because she was BORN in the UAE, specifically Dubai, she was taken aside at customs. She was questioned, her makeup kit was treated as though it were a bomb, she was searched as though she were carrying weapons, she was fingerprinted, and pictures were taken of her and entered into APHIS. She was told to stop crying because there was "no need to be crying." She was treated like a criminal with weapons of mass destruction, yet she was completely innocent. And that's the "happy" version of the story. She didn't receive an apology or anything...
I dunno, I could go on and on and on and ON with stories, but that won't get us anywhere or answer my question. So, my question, as stated above: does it annoy or upset you that "everyone" seems to dislike Americans as a generalized group? I know I would be pretty upset about it. In fact, it kind of annoys me anyway.
Like I said above, this really isn't meant to be offensive at all, and I really hope nobody takes it that way; it's simply me trying to understand how you cope with it all, if it even bothers you.