Associations to the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, including most of the Iberian Peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of, because words diminish them - words shrink things that seemed limitless when they were in your head to no more than living size when they're brought out.
Stephen King