Human history, from the very beginnings of civilization, has been beset by the phenomenon known as empire. From the ancient Egyptians, Assyrians, and Chinese, through the classical age of the Romans and Persians and the Islamic, Mongol, and Holy Roman Empires of the Middle Ages, to the colonial empires of Britain, France, and Spain, no period of known civilized history has existed without some form of empire. What is it about empire that makes it seem such a predetermined part of human nature? Is it an inevitable process that will arise whenever one large group amasses an overwhelming share of power? Up until the 18th century this appeared to be the case – every major world power seemed to count imperialism among its primary motivating forces. Yet something happened in 1776 that would appear to call this assumption into question. The formation of the United States of America constituted a major shift in history. It was a nation seemingly founded on freedom, justice, equality, and the preservation of basic human rights. This was something new: a nation whose supposed goal lay not in the material realm but in the moral one. It was a nation that stood as a symbol against oppression and subjugation throughout the world. Surely the people of the United States had finally developed a system of government that fought against this imperialist mindset of domination and exploitation and would usher in a new era of peace and equity in the world.
This way of thinking of the United States as inherently anti-imperialist has been prevalent throughout American history and persists strongly to the present day. But is this in fact the case? A closer examination of the actual formation of the modern United States and its policies – both foreign and domestic – suggests that the opposite may be true. Much of what America has done and has attempted to do both supports and contradicts the ideals upon which it was founded. In reality, since...