Associations to the word «Imperialism»

Wiktionary

IMPERIALISM, noun. The policy of forcefully extending a nation's authority by territorial gain or by the establishment of economic and political dominance over other nations.

Dictionary definitions

IMPERIALISM, noun. A policy of extending your rule over foreign countries.
IMPERIALISM, noun. A political orientation that advocates imperial interests.
IMPERIALISM, noun. Any instance of aggressive extension of authority.

Wise words

Wisdom does not show itself so much in precept as in life - in a firmness of mind and a mastery of appetite. It teaches us to do, as well as to talk, and to make our words and actions all of a color.
Lucius Annaeus Seneca