Yes, white people thought they were the most important nearly 100 years ago, but in my opinion we have changed immensely as a nation and need to look past this demonization of our racist past as an outlook on our future. No race is more important than another, and that idea should ripple throughout our society. We grow up hating each other because of a bunch of crap we are fed by bad actors and by people who genuinely enjoy watching racial and cultural carnage. All facets of history should be taught so everyone knows what people around the world have faced.