When Did America Enter World War I?

The United States of America entered World War I on April 6th, 1917. The Armistice between the Allies and Germany was signed on November 11th, 1918, ending the fighting on the Western Front. The Treaty of Versailles, signed on June 28th, 1919, formally ended the war.