You can't turn on a cable news channel these days without some pundit or panelist saying that Democrats need to "move to the middle." What the heck do they mean?
Do they mean that Democrats should say it's just fine that more than 30 million Americans have no health insurance?
That we should love the big banks making hundreds of billions in profits by ripping us off and putting our economy at risk?
That the fossil fuel and chemical industries should keep poisoning us and our planet?
That it's just fine for drug companies, for-profit colleges, and charter schools to keep fleecing us?
That it's just fine that the American Dream is dead and there's no need to bring back unions and union jobs?
What is "the center," other than the status quo?