Are Dentists Doctors? Yes, Here’s Why
Have you ever wondered whether dentists are doctors, and why they carry the title “Dr.” before their names? Dentists do hold doctoral degrees and are therefore rightly called “doctors.” Whether a dentist is a medical doctor, however, is a more complicated question. This article looks at the professions entitled to use the title “doctor” and explains why not everyone who holds that title should be treated as interchangeable in the healthcare industry.