Associations to the word «Imperialism»

Wiktionary

IMPERIALISM, noun. The policy of forcefully extending a nation's authority by territorial gain or by the establishment of economic and political dominance over other nations.

Dictionary definition

IMPERIALISM, noun. A policy of extending one's rule over foreign countries.
IMPERIALISM, noun. A political orientation that advocates imperial interests.
IMPERIALISM, noun. Any instance of aggressive extension of authority.

Wise words

A word is not a crystal, transparent and unchanged; it is the skin of a living thought and may vary greatly in color and content according to the circumstances and time in which it is used.
— Oliver Wendell Holmes, Jr.