ANALYSIS/OPINION

Racism in America has often been reduced to white-against-black discrimination, and understandably so. After all, it was white colonists who forcibly removed Africans from their homeland and brought them to America to serve as slaves on plantations. Following the Civil War and the end of slavery, this country continued to struggle with racism. …