The U.S. has cooperated, and still cooperates, with various military dictatorships around the world. Obviously we would prefer to see them democratized, but we do it because we have national interests, whether it's working with Pakistan on Afghanistan or whatever.
Fascism and communism have not entirely disappeared, but they have certainly been sidelined, and liberal democracy has come to be accepted around the world, in theory at least, if not always in practice.
In 1920, the West ruled huge swaths of the world.
Well, I think the United States first of all has to recognize the world for what it is.
I think we've seen at least the beginnings of rather significant social and economic change in the Muslim world, which I think will in due course lead to more political change.