As Americans, we're taught every other country is ours to exploit, and unless you're American... well, you're just not worth the flesh you reside in...


While my own education about other countries was fairly broad, many people I know tell me that where they went to school, they didn't really learn much about anything beyond this country's borders.

I don't know how widespread this problem is, but if it's common, I shudder to think where foreign policy will go in the next generation, given where it is now.