When Did Imperialism Start?

Imperialism in the United States began with the onset of the Spanish-American War in 1898. Because people of wealth hold the majority of power in the U.S., many wonder whether imperialism will ever end.