What is Feminism?

Feminism is a political and social philosophy whose aim is to advance the standing of women in society. Most feminists hold that women should be treated as the equals of men.