When Did the United States Enter WWII?

The United States entered World War II after Japan attacked Pearl Harbor on December 7, 1941. Congress declared war on Japan the next day, December 8, 1941. Germany and Italy, Japan's Axis allies, declared war on the United States on December 11, and the United States responded the same day by declaring war on both.