The Dental Profession
Dentists help patients maintain and improve oral health, quality of life, and appearance by diagnosing and treating conditions that affect the teeth, tongue, gums, lips, and jaws. They are often the first to recognize signs of other illnesses, including cancer and cardiovascular disease.
Dentists also perform trauma surgery, place implants, and graft tissue to repair, restore, and maintain teeth, gums, and other oral structures that have been lost or damaged through accident or disease.
Education is an important part of dentistry. In the dentist's chair, many patients learn how to maintain oral health and prevent disease, and dental professionals also play a leadership role in implementing community-based preventive programs such as water fluoridation, sealant programs, and oral cancer screening.