What is Informed Consent?

Informed consent means that you have read and understood the information presented to you and, based on that understanding, agree to take part in the activity described. It is how you give authority to doctors to carry out certain medical procedures, or to researchers to involve you in research or experiments. Informed consent protects both the patient or participant and the professionals carrying out the procedure or study.