What Does FDIC Mean?

The FDIC, or Federal Deposit Insurance Corporation, is an independent agency created by the federal government in 1933. Its main job is to promote public confidence in the United States financial system. It does this by insuring deposits at banks and savings institutions and by addressing risks to the deposit insurance fund.