Does the West Any Longer Have a Left?
There is no sign of one in the traditional meaning of “left-wing.”
In former times, the left stood for the working class. Leftist literature was about class antagonisms. The left attacked the capitalists. Today the left is funded by them. The words “reform,” “justice,” and “progressive” all meant different things in the 20th century than they mean today.
Today something called the “left” looks with hatred at the working class: the white racist, misogynist “Trump deplorables.” The call is not to overthrow capitalist exploitation but to erase “whiteness,” by which is meant the power of white ethnicities in their own countries, along with their equal rights, monuments, history, and culture. A society has been forming for some years in which white people have become second-class citizens. In the US they have lost the protection of the 14th Amendment to the US Constitution. It is legal to discriminate against them on behalf of “preferred minorities” in university admissions, employment, and promotions. White people have no protection against hate speech and hate crimes. Indeed, the law and the emerging new culture do not recognize any such thing as hate speech or a hate crime against a white person.
Today white people are denigrated by something called the “left” with the same derision once directed at capitalist exploiters. The difference is that today the white working class is cast as the exploiter.
As part of its agenda of cleansing society of white influence, the “left” has cooperated with the American Establishment in overthrowing a president regarded by the Establishment as a threat to its agendas and by the “left” as a racist, misogynist, fascist. The left, formerly an impetus for reform, is now a pillar of the Establishment and is funded by the Establishment.
Throughout the Western World white people are on the defensive to a much greater extent than capitalists ever were. The evil of hatred has come into its own, and deracinated white people, stripped of confidence, are unable to resist being positioned as objects of hatred.
Throughout the Western World those who try to stand up for the national racial ethnicities from which their countries take their names, whether English, French, German, Greek, Italian, or Swedish, are demonized as “nativists” or “nationalists,” words made to mean racist, Nazi, or fascist, and they are suppressed. No Western country has a ruling party that represents the white ethnicity that gives the country its name. In Germany, Britain, France, Italy, Sweden, and elsewhere the ruling party represents immigrant-invaders and maintains an open-borders policy. In the US the Democrat Party is committed to open borders. Illegals even vote in US presidential elections.
White people are a people abandoned by the very governments their own votes put into power.
Nowhere in the Western World are the young and the immigrant-invaders being enculturated into Western civilization. Diversity, multiculturalism, and the special legal privileges of “preferred minorities” are gradually achieving the goal of erasing “whiteness.”
The decades-old chant of brainwashed university students, “Western civ has to go,” is being realized.