Associations to the word «Namibia»

Wiktionary

NAMIBIA, proper noun. Country in southern Africa. Official name: Republic of Namibia.

Dictionary definition

NAMIBIA, noun. A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa.

Wise words

Think twice before you speak, because your words and influence will plant the seed of either success or failure in the mind of another.
Napoleon Hill