Associations to the word «Deutschland»

Dictionary definition

DEUTSCHLAND, noun. The German-language name for Germany; a republic in central Europe, split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Hope is the word which God has written on the brow of every man.
Victor Hugo