Associations to the word «Doctorship»

Wiktionary

DOCTORSHIP, noun. Professional position or title of a doctor.

Wise words

Don't you know this, that words are doctors to a diseased temperament?
Aeschylus