When Did the War of 1812 End?

The War of 1812 formally ended in February 1815, when the United States ratified the Treaty of Ghent. The treaty had been signed in Ghent (in present-day Belgium) on December 24, 1814, but news traveled slowly, and fighting continued into January 1815, including the Battle of New Orleans. The treaty restored the prewar status quo, with neither side gaining territory; in practice, it confirmed that Britain would deal with the United States as a sovereign nation rather than attempt to reverse its independence.