Associations to the word «Deutschland»
Dictionary definition
DEUTSCHLAND, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.
Wise words
More wisdom is latent in things as they are than in all the words men use.