It's an ingrained racism that pervades all of Western culture. I'm a liberal white man and I absolutely abhor racism, but it's ingrained in our culture, and what's happening in the US is the focal point that should make us all take a long hard look at ourselves: our daily interactions with people of a different colour, our preconceptions, the way we promote stereotypes. We're not innocent in this. Everyone needs to fight this fight, and also own up to the occasions when we've been wrong.
I don't mean this to start a fight or get at anyone; I just want to point out that it's all of our fight to make this better than it is.