When Did the US Enter WWII?

The US officially declared war on Japan on December 8, 1941, the day after the attack on Pearl Harbor, Hawaii. This formally brought the US into the war, although it had been supplying Great Britain with money, planes, and other matériel for roughly two years prior.