When Did America Enter WWII?

The United States was pulled into World War II when Japan attacked Pearl Harbor, Hawaii. The attack occurred on December 7, 1941, and the U.S. declared war on Japan the following day, December 8.