Associations with the word «Katar»

Dictionary definition

KATAR, noun. An Arab country on the Qatar Peninsula; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.

Wise words

All our words are but crumbs that fall down from the feast of the mind.
Kahlil Gibran