What Happened After World War I?

After World War I, the leading European powers, along with Japan, expanded their imperial reach around the globe. Nations such as Germany and Italy, their economies shattered, turned to the fascist dictators Hitler and Mussolini. Spain descended into civil war.