What Ended World War II?

World War II in Europe ended with Hitler's death and the fall of Berlin. These events were decisive because the leaders left in command after Hitler's suicide, lacking both the means and the will to continue fighting, surrendered unconditionally within days. The war in the Pacific, however, came to an end after atomic bombs were dropped on Hiroshima and Nagasaki, prompting Japan's surrender.