Associations to the word «Deutschland»
Dictionary definition
DEUTSCHLAND, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.
Wise words
The most valuable of all talents is that of never using two words when one will do.