What are the Winter Months?

The winter months are the coldest months of the year across most of the United States. Astronomical winter begins around December 21 and ends around March 20. In places closer to the equator, such as south Florida, it doesn't get as cold as it does in other parts of the United States. During these months, many regions experience ice storms, snow, sleet, hail, and freezing temperatures.