What Happened After World War II?

When World War II ended in 1945, the world began to recover from the economic depression that had preceded the war. Germany was split into two halves: a democratic West Germany, occupied by the Western Allies, and a Soviet-controlled East Germany. Many Jewish survivors left Europe for Israel and the United States. It is also important to note that the United Nations was founded in the aftermath of the war, in 1945.