Associations to the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay.

Dictionary definition

URUGUAY, noun. A republic on the southeastern coast of South America; declared independence from Brazil in 1825.
