Associations with the word «Nigeria»

Wiktionary

NIGERIA, proper noun. Country in West Africa, south of the country of Niger. Official name: Federal Republic of Nigeria.

Dictionary definition

NIGERIA, noun. A republic in West Africa on the Gulf of Guinea; gained independence from Britain in 1960; most populous African country.

Wise words

Love. Fall in love and stay in love. Write only what you love, and love what you write. The key word is love. You have to get up in the morning and write something you love, something to live for.
Ray Bradbury