What Does "Federal Government" Mean?

In the United States, the federal government is the central, national government that holds authority above the individual state governments. It is primarily responsible for matters of national scope, most notably foreign relations, while powers not delegated to it are reserved to the states.