Where Did WWII Take Place?

WWII, or World War II, was fought predominantly in Europe and the Pacific, with major campaigns also in Asia and North Africa. The United States entered the war after Japan attacked Pearl Harbor, Hawaii, in 1941, and the war in the Pacific ended after the atomic bombings of Hiroshima and Nagasaki in 1945. In Europe, the conflict centered on the Allied effort to defeat Nazi Germany under Adolf Hitler.