Associations to the word «Katar»

Dictionary definition

KATAR, noun. An Arab country on the Qatar Peninsula; it achieved independence from the United Kingdom in 1971, and its economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.

Wise words

Language is a process of free creation; its laws and principles are fixed, but the manner in which the principles of generation are used is free and infinitely varied. Even the interpretation and use of words involves a process of free creation.
Noam Chomsky