Associations to the word «Directorship»
Wiktionary
DIRECTORSHIP, noun. The office of a director; a directorate.
Dictionary definition
DIRECTORSHIP, noun. The position of a director of a business concern.
Wise words
Words are but symbols for the relations of things to one another and to us; nowhere do they touch upon absolute truth.