What is Federalism?

Federalism in the United States concerns the relationship between the state governments and the federal government. This relationship has evolved over time, marked chiefly by a shift of power away from the states and toward the federal government beginning with the end of the American Civil War.