What is Western Medicine?

Western medicine is a branch of medicine that is primarily diagnostic, built on internal medicine, surgery, and their related specialty branches.