What Do Vitamins Do for Your Body?

Vitamins are nutrients your body needs to grow, develop, and function properly. They occur naturally in many of the foods we eat and can also be taken as dietary supplements.