What Happened in the Westward Movement?

The westward movement in the United States is closely tied to what historians call ‘Manifest Destiny.’ This was the belief that American institutions were virtuous, that they would and should spread, redeeming and remaking the world in the image of the United States, and that this destiny of the Anglo-Saxon people had been ordained by God.