What is Health Care Reform?

Health care reform is any policy that seeks to change or improve the way health care is currently delivered, financed, or regulated. It is generally enacted through governmental policy.