Let me reverse the question:
If the United States is as racist, xenophobic, violent, and lacking in benefits as Democrats say it is, why do so many people from around the world come here seeking a better life (to use the Democrats' own words)? Aren't Democrats indirectly admitting that our country is pretty great after all?