Hey, I'm not arguing the case very well. Forget I mentioned Disney -- I can rarely help having a subtle dig even when I'm trying to argue sensibly -- it's as much an attempt to deflate my own self-important sense of windbaggery as the other person's.

Moving on from that:

Just because we've created things that have an appeal abroad, we're imperialists?

To a certain extent, yes, albeit unwittingly.

everything I've been getting ... "the U.S. comes off as arrogant"

That's because it's true. I'm not saying that the US is arrogant. What I'm saying is that the US needs to take more care not to appear arrogant. The current administration is failing miserably, IMO.

dictionary.com has this:

"arrogant: Having or displaying a sense of overbearing self-worth or self-importance."

...which just about sums up -- to my mind -- current US foreign policy (overbearing).

Did Bill Clinton's administration also come off as arrogant across the pond?

No. Not particularly.

I didn't really want to get into an argument about "US imperialism" -- I will happily concede that traditional, territorial imperialism doesn't describe the US. If anything, the US (government/populace) is traditionally isolationist.

So, in order to clarify/alter my standpoint about US imperialism, I had a quick poke around on Google for recent commentary. I found the following:

"The US is imperialist"



"No it's not"



The commentary that accuses the US of imperialism is reactionary, and rarely on point, but I'm trying to give you an idea of what you're up against.

I also found this: Why Don't They Like Us?

and this: The Myth of Cultural Imperialism, which (based on a brief skim -- I'll read it more fully later) is a fairly balanced discussion of the "cultural imperialism" that I was trying to talk about.

So, to summarise: Some people think that America is arrogant. That's a fact, whether the impression is right or wrong. See if you can change their impressions.
_________________________
-- roger