What is American Imperialism?

American Imperialism describes the belief that the USA is an empire. This belief rests on the observation that the American military has established bases in many other countries and is currently involved in multiple policing actions and occupations.