Associations to the word «Katar»
Noun
Dictionary definition
KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.
Wise words
Words mean more than what is set down on paper. It takes the human voice to infuse them with deeper meaning.