Sometimes the shortsighted history that is used to brand all white people as evil really irritates me. Yes, white people held slaves. But white people have also BEEN slaves, and some still are today. Yes, black people were slaves, but black people also held slaves, and some still do today. All races have been both the enslaved and the slaveholders.
Yes, white people brought disease to the New World. Just as disease has been brought to new areas all over the world ever since the Garden was taken away.
Yes, white people "took" America. Just as every race has had some members that took from other races (and their own race) and some members that lost out to other races (and members of their own race.)
All through history, people groups have been spreading disease, waging war, trading slaves, taking and losing land, and so on.
The biggest difference is that white people began to look on slavery, the killing of civilians, and other historical horrors with, well, horror, and began to put an end to them. Today, there are many people of all races who look upon the slaughter of unborn children with the same horror with which early Christians looked upon infanticide, and who are working hard to wipe out that next level of abuse against other people. (And look which party is in favor of continuing to inflict horror upon other people...)