Associations with the word «Deutschland»

Dictionary definition

DEUTSCHLAND, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Kind words do not cost much. Yet they accomplish much.
Blaise Pascal