Associations with the word «Gambia»

Wiktionary

GAMBIA, proper noun. A country in Western Africa. Official name: Republic of The Gambia.

Dictionary definition

GAMBIA, noun. A narrow West African republic surrounded by Senegal except for a short Atlantic coastline.

Wise words

Watch your thoughts, they become your words. Watch your words, they become your actions. Watch your actions, they become your habits. Watch your habits, they become your character. Watch your character, it becomes your destiny.
Anonymous