Associations with the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, including most of the Iberian peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

It is better either to be silent, or to say things of more value than silence. Sooner throw a pearl at hazard than an idle or useless word; and do not say a little in many words, but a great deal in a few.
Pythagoras