Associations with the word «Deutschland»

Dictionary definition

DEUTSCHLAND, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Wisdom does not show itself so much in precept as in life - in firmness of mind and a mastery of appetite. It teaches us to do as well as to talk, and to make our words and actions all of a color.
Lucius Annaeus Seneca