Western medicine

(WES-tern MEH-dih-sin)
A system in which medical doctors and other health care professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.