When is Winter?

In the United States, winter typically begins in December and continues into February or March of the following year.