What is the Definition of Body Image?

Body image is a woman's perception of her own physical appearance, including the thoughts and feelings she experiences when she looks at herself in the mirror. It can be positive or negative, and it is often shaped by the standards society sets for how women should look.