Associations to the word «Centrism»

Wiktionary

CENTRISM, noun. Any moderate political philosophy that avoids extremes.

Dictionary definition

CENTRISM, noun. A political philosophy of avoiding the extremes of left and right by taking a moderate position or course of action.

Wise words

More wisdom is latent in things as they are than in all the words men use.
Antoine de Saint-Exupéry