President Biden is backing Vice President Kamala Harris to take the Democratic presidential nomination, he announced Sunday. She's taken more liberal positions on health care.
Mainstream America defines liberal as left and conservative as right. They are absolutely not familiar with neoliberalism, classical liberalism, or liberalism as a 1700s-era political-philosophical conversation.
To be fair, the idea that "the left" can't be liberal is itself pretty complicated.
The terms "left" and "right" are meaningless anyway and should be abolished. They just entrench thoughtless "us versus them" tribalism instead of making politics about actual policies and issues and how people are affected.
But to sum it up, her position in the 2019 primaries was somewhere in between Biden's and Sanders's. Basically Medicare Advantage for all (with a straight public option included and available to everyone, and private insurers not excluded but strictly regulated). I'm interested in what she actually proposes now that she's most likely the candidate.
Look, the mainstream American press struggles with these abstract political concepts. Words like liberalism, socialism, etc. have lost all meaning.
What you mean is that the Republicans have spent decades on Red Scare bullshit trying to conflate Democrats with commies, and the media has been complicit in it.
That's a shitty play on words, I assume. Socially liberal is typically leftist; economically liberal is usually right-wing. So being left of the president on health care is good if it's meant in the social sense, bad if it's meant in the economic sense.
If it were intentional that'd be one thing, but the author isn't contrasting economic and social policy (health care is just economic), so I'm pretty sure he's just confused.
"Liberal" is opposed to "Authoritarian" and just means a person who favors democracy and personal freedom. Or it should, but in America our fascist conservative party has convinced people that "liberal" is a slur and also means "progressive," which is the actual opposite of conservative. But liberals aren't always progressive, which is why actual American leftists, who are progressives, use the term "liberal" to derisively refer to centrists. American centrists are politically conservative but hold some socially progressive values.
The wake-up occurs when you realize that politically/economically conservative policies lead to and support socially conservative ones. One can't actually be socially progressive but economically conservative; it's an incoherent ideology. Americans are raised to be good at doublethink and at distracting ourselves, so we're able to cope with the contradiction.
Conservatives are the people selling myths like the morality of enlightened self-interest, and liberals are the ones trying to make the current system fair.
Leftists are the ones saying "holy fuck, come on guys, this is clearly insane. We grow way more than enough food for everyone; can we please start by deciding no one will starve and work backwards from there?"