Associations to the word «KKK»

Dictionary definition

KKK, noun. A secret society of white Southerners in the United States, formed in the 19th century to resist the emancipation of slaves; it used terrorist tactics to suppress Black people.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of, because words diminish them - words shrink things that seemed limitless when they were in your head to no more than living size when they're brought out.
Stephen King