What Does Anatomy Mean?

Anatomy is the science that deals with the structure of animals, plants, and humans. The field examines these organisms in detail, both internally and externally.