Thursday, May 13, 2010

The First Noble Falsehood

All of the religions of the Classical Civilized Paradigm have in common a core belief about the relationship between the soul and physical existence. This belief is expressed in different ways in different faiths, but the most elegant expression in my opinion is the Four Noble Truths of Buddhism, first of which is that all life contains the element of suffering, or, more simply and as believed in practice, that all life is suffering. For Buddhists, the joys and pleasures of life (while acknowledged to exist) are, in essence, the bait for a trap. They’re here to bind us to the world so we can suffer more. The point is to get out.

No other religion puts it quite that way (or, in my opinion, quite that well), but all of the Great Religions share that conclusion: the point is to get out. We don’t belong here. We belong somewhere else or in some other conditions: in Heaven, in Paradise, reunited with God from Whom we have become separated, or restored to the bliss of non-manifestation. The details vary widely, of course. Religions of the Hindu/Buddhist complex believe in reincarnation or soul transmigration, and so teach that we may go through many incarnations before finally being freed to go where we belong. Religions of the Abrahamic lineage (Judaism, Christianity, and Islam) don’t have this belief, and so teach that there is a single lifetime after which comes God’s judgment and (hopefully) a passage to the place where we belong. But in none of these faiths is incarnate, manifest existence on the physical plane seen as anything but a mistake.

This idea – that we are here by mistake, and our focus should be to remove ourselves somewhere else – is what I call, in a play on the Buddha’s teaching which I’m sure he will have enough enlightenment to forgive, the First Noble Falsehood.

In fairness to the Buddha, we can’t be sure that this is what he actually taught. No writings by him have survived, and that’s rather a mystery. The Buddha, who was a prince, was certainly literate. Did he really not write down any of his teachings, or were his writings lost – or suppressed – after his death? We may wonder the same thing about Jesus and Mohammed, who were also literate and who have also left us no writings. Be that as it may, an inevitable disconnect occurs in communication between the teachings of an enlightened spiritual leader such as the Buddha or Jesus, and the form those teachings take when embodied in an organized religion, especially after the old boy isn’t around any longer to interfere with the process.

Part of that disconnect arises simply because of the difficulty of communicating the deep truths of the spirit in words. Language isn’t designed for that purpose. Its vocabulary communicates things from one person to another that both people are already familiar with. In giving you directions to the post office, I know that you know what a street is, what a post office is, what it means to turn left or turn right, and how to identify various landmarks that I may give you to show the way. If someone wants to communicate something that is a bit outside his listeners’ experience, he starts with the things the listener knows and builds on them. But to communicate spiritual reality is very, very difficult, because it is outside of the experienced world of most people. It can usually only be done in metaphors and parables, and even then most people will attach meanings to the parables that aren’t correct. “He who has an ear, let him hear.”

But beyond this, further problems came in for all Classical Paradigm religions as they became established and involved in politics. Ideas and teachings that did not serve the political purposes of the faith were suppressed, and ideas that were necessary for that purpose were introduced. And it is at this juncture, I believe, that the First Noble Falsehood arose.

Politics is in large measure about privilege. It’s a battleground between those who enjoy an elite, entitled, privileged position, and those who would like to see them lose it. Sometimes those who would like to see them lose it simply want to replace them as the privileged elite. Sometimes the goal is to eliminate privilege altogether or at least reduce its prerogatives. In modern times that latter manifestation has become more acute, but in the ancient times when the Buddha lived, the elite (to which class he himself was born) had things pretty much any way they wanted. Politics – a game the Buddha’s social class played exclusively, commoners need not apply – was thus all about upholding and supporting their status and power, except of course when it involved infighting among them for a larger share of power than one’s fellows.

Religion was, under the Classical Paradigm, always a partner with the state. Often, it was an actual arm of the state. As such, it always served the purposes of the state, which included public order, but also included the upholding of privilege. Now privilege, to the enlightened eye, is unjustifiable. It is simply wrong, and needs to be abolished. And so, if religion is to serve the purposes of the state, a filter must exist to allow the passion which an enlightened teacher’s teachings inspire to be twisted into the service of privilege, something he would in all cases have abhorred. And the First Noble Falsehood serves that purpose admirably, by taking that passion and turning its focus away from this world altogether and towards another reality where there is no social privilege to be threatened.

The challenge to privilege and power represented by spirituality when focused on this world where we actually live can be seen in modern times. Look what Gandhi accomplished for Indian independence or Martin Luther King for racial equality in America. There is magic in genuine spirituality, a power of the heart that moves the hearts of others. To the holders of privilege, spirituality is dangerous, and must be channeled into non-threatening, or ideally even privilege-supporting paths.

Jesus was not put to death on a whim.

The story of Christianity’s co-option by the Roman state in the 4th century CE is a good illustration of how the process works, and where the First Noble Falsehood comes from. I said above that under the Classical Paradigm religion was always intertwined with the state, but actually early Christianity represents an exception. Under Roman law, Christianity was an illegal religion, but the laws were seldom enforced. Christians could be arrested any time, given a chance to recant their faith, and executed in brutal ways if they refused, but most of the time they weren’t. The net effect was that Christians were free to practice their religion (except when some Emperor or other got a bug up his sphincter and started a persecution), but the faith itself was not involved with the state at all. It couldn’t be, because it was illegal. Nor could intra-faith politicians (you know the type, all religions have them) call on the state to enforce their authority. For that reason, during the time between the mission of Paul and the Council of Nicaea, Christianity was one of the freest and most diverse religions of all antiquity.

It was also frequently troublesome. Christians refused to pay token worship to the official state religion, and so in the view of the state endangered Roman society’s relationship with the Gods, on whose favor it depended. Christians were also frequently pacifists and anti-slavery advocates, thus endangering the Roman state’s ability to defend itself against foreign enemies and the foundation of the Roman society’s economy. Here was spirituality acting as a threat to privilege, which it so often does.

Three Emperors attempted to stamp out the religion through persecution. None succeeded. In the early 4th century, the emperor Constantine tried a different approach: co-option. By repealing the laws against Christianity, by calling a council of “Bishops” (note that this in itself favored the more authoritarian forms of Christianity over the less so, since only the authoritarian Christian sects recognized “Bishops” to start with) from all over the Empire to iron out exactly what the faith stood for and taught, and finally by making Christianity itself the state religion of the Roman Empire, Constantine and his successors transformed it from a rebellious and dangerous spirituality into a useful tool of politics. One of the primary levers of this transition was to bring to the fore, and interpret in certain pro-privilege ways, the otherworldliness that had always been an element in the religion. Rather than oppose war and slavery in this world, believers were taught to focus on the next, where there was no war and where everyone was free. By the time of the Roman Empire’s fall, Christianity had become wholly a tool of civic authority and the defense of privilege. The First Noble Falsehood was an important part of that transition.

The transition to that state for other religions is less visible, and in some cases I suppose it’s possible that the First Noble Falsehood was enshrined from the beginning so that no actual transition occurred. What is certain, however, is that so long as religion and the state remained partners in politics, any religion allowed to survive would of necessity transfer any passions for reform from this world to another.

With the separation of church and state which has become the modern norm, spirituality is now free to manifest itself in worldly ways and has begun to do so once more. It is, I think, time to abandon the First Noble Falsehood. We are not here, embodied in physical existence, as a mistake, and our goal is not to leave this life for another, but to make of it the best and holiest thing we can, in love of those around us and in homage to the principles we cherish.

Sunday, May 2, 2010

Personal Power and Political Power

“Money is power” is a cliché. That wealth leads to power, and vice-versa, is so well-known that an important underlying question is often missed. One finds arguments about which of the two is the more important for the commercial elite in our society – again missing that underlying question.

The important underlying question that’s often missed is this: what KIND of power? Are we discussing political power or personal power?

Political power is the ability to influence governing institutions. It’s held (obviously) by elected officials. Barack Obama, at present, has a great deal of political power. He can issue orders and have them carried out by the agencies of the U.S. government, by the United States military forces, and by the Democratic Party which he heads. He can use the persuasive power of his office to influence votes in Congress. To a lesser degree, all elected members of Congress also hold political power, as do Cabinet members and other important unelected government officials and those holding office in state and local governments, or in foreign governments throughout the world. Political power is also wielded by those who don’t hold government offices but who, through campaign contributions and lobbying, or through the ability to persuade a following among the citizens, can influence the actions of the government. Most of the time, when people speak of “power” being held or valued by the commercial elite, this is what they mean: the ability to influence government actions by means of persuasion and bribery.

If that sort of power, political power, is what we’re talking about, then I’d have to say on the whole – with a few exceptions – that money is more important to the commercial elite and power is only a means to the end of amassing more money. But there’s another sort of power that lies, I believe, at the heart of all desires to become mega-rich in the first place. Sometimes, for some people, it lies at the heart of a desire for political power as well. Both money and political power can be means to the end of amassing personal power: the ability to make other people, as individuals, into servants of one’s own will. Political power, in extreme or archetypal form, is exemplified by the dictator. Personal power, in extreme or archetypal form, is exemplified not by the dictator but by the slave owner.

Personal power manifests in the powerful as a sense of superiority, and in the powerless as a sense of inferiority. Personal power lets a powerful person look at someone over whom he has power and say to himself – and, through various gestures and subtle means of communication, to his inferior as well – “I am better than you,” and makes the inferior say in the same ways, “You are better than I.” Unlike political power, it’s a very primal sort of power, with roots going back to the origins of our species. It pumps the body full of adrenaline and testosterone, or churns the guts with loathing, fear, and self-hatred.

Personal power is a man’s ability to seduce another man’s wife right in front of him, and have her be afraid to say no and him be afraid to do anything about it. Personal power allows a person to demand that others bow and scrape and show their submission. Personal power allows cruelty to others without penalty, and enables retaliation for even the most minimal slights. Personal power is what the power-hungry desire on a visceral level, and freedom from anyone having personal power over us is what we mean in our hearts by the word “liberty.”

Personal power is a face-to-face thing. Unlike political power, it isn’t impersonal power over the masses, but one-on-one power over an individual. It’s the ability of one individual to make another grovel, serve, and obey.

Government officials seldom hold personal power over ordinary citizens – we seldom interact with government officials in any direct way. They have personal power only over their employees, interns, and so on, and those who come within the purview of their immediate jurisdiction under the law.

Employers, on the other hand, always have personal power over their employees. We have laws protecting the rights of workers for that very reason, to limit the consequences of personal power. Landlords have personal power over renters, and we have laws protecting tenants’ rights for that reason. All such laws were fought tooth and nail by employers and landlords when they were proposed, partly because obeying them is often an expense, but in large part because it removes some of the payoff of personal power.

Every time an employee successfully starts a small business, or becomes self-employed, he gains freedom. The commercial elite may still have a lot more money than he does, but he is no longer dependent on any of them. No employer holds personal power over him. Every time a person buys his own home, he gains freedom. No landlord holds personal power over him.

That’s the underlying, unspoken reason why the pressure is on to keep wages suppressed in America. It’s not the only reason, of course; it’s reflexive for business owners for whom wages are a cost to be kept down, and who seldom consider the larger picture. But as long as wages are kept low, the number of people who will be able to escape from wage work and become free is limited, and so is the number of people who will be able to afford their own homes. With more and more money funneled to the very rich at the top of the ladder, they have more money to play with and gamble with, but at least as important is that the majority of the people are kept on the treadmill, where they can be controlled. Where they can be told what to do, and made to serve.

Personal power needs to be recognized and understood. We need to stop thinking “government” reflexively when we use the word “power.” Sure, government power is important and potentially dangerous. We need to make sure it is restrained by the three safety controls we put on it: separation of powers, public accountability, and explicit limits of government action such as the Bill of Rights. When these become frayed, as they have in recent years, we need to restore them.

But on a visceral level, the government is not what most people think of when they imagine freedom. They think of their boss, or their landlord, and being able to tell them to shove it. They think of being in a situation where no one can tell them what to do. The real enemy of freedom in a democracy is not the government, but rich and powerful individuals able to exercise personal power. To judge whether a government is a tyranny, a good rule of thumb is to ask to what extent it serves the interest of rich and powerful individuals – helping them to exercise personal power over others. To say that government secures and protects people’s rights is another way of saying that it protects the weak from the strong. A tyranny instead aids and abets the strong in dominating the weak.

Of course, the rich and powerful often try to confuse the issue by saying that a government interfering with their freedom to tyrannize others is a tyranny, and to them, it is – as it has to be; if it weren’t, it would be a tyranny to the rest of us. In just that way the slave owners of the antebellum South complained of the tyranny of Washington. We need have no more sympathy for our capitalist masters today than we do in hindsight for the plantation masters of yesterday.

Wednesday, April 21, 2010

Real Conservatism: Bring Back the Federalist Party!

Conservative, adj.:

1. disposed to preserve existing conditions, institutions, etc., or to restore traditional ones, and to limit change.
2. cautiously moderate or purposefully low: a conservative estimate.
3. traditional in style or manner; avoiding novelty or showiness: a conservative suit.

None of those three definitions describes people who self-identify as "conservatives" in American politics today. And therein lies the problem.

A healthy political dialogue in a progressive society would occur between progressives on one side and conservatives on the other. Progressives would push for change, identifying problems that need fixing or opportunities to achieve something, to make society more egalitarian, wealthier, healthier, better-educated, more enlightened, more peaceful, fairer, more just, freer, etc., etc. Conservatives would object with "yes, but" arguments. But do we really need to make this change at this time? But look at the cost! But consider the unforeseen consequences. The subtext of all of which is: We agree with the overall goal. But let's not be hasty. Maybe this isn't the best way to do it, or the best time.

It's a useful -- in fact, necessary -- function in the dialogue, conservatism. It's necessary because (let's face it) progressives aren't always totally smart. On occasion, we can be profoundly stupid. Half-baked. Overly zealous. Insufficiently mindful of costs, social and political realities, and unintended consequences. So it pays to have a conservative side of the dispute, frustrating though we may sometimes find it, to insist that progressive ideas prove themselves in imagination and accounting before they're actually implemented.

But that function can only be served by real conservatives. Wingnuts need not apply. Those who reject, not only the half-baked hasty ideas sometimes generated by progressives, but the very idea of progress, are not conservatives, because one of the cardinal principles of conservatism is to support the traditional values of one's society, and the traditional values of the United States of America are progressive. You know, things like "We hold these truths to be self-evident, that all men are created equal," or "government of the people, by the people, for the people." Conservatives -- real conservatives, that is -- hold to the same progressive values as progressives do; they're just more cautious about implementing them and less convinced at any point about the size of the step we're ready, as a society, to take.

The problem with conservatism in today's American politics is that the term has been hijacked. It no longer applies to real conservatives, it applies today to wingnuts who reject progressive ideals altogether. It applies to people who don't believe in the secular, Enlightenment-based democracy that America traditionally seeks to build, but would instead create a theocracy. It applies to people who don't recognize the value of a multiracial, tolerant society, but would have a white people's country. It applies to those who advocate, not a cautious approach to change, but a radical one -- in anti-progressive directions. To re-criminalize abortion is not conservative, it's a radical change. To abolish such long-standing government functions as Social Security, Medicare, aid for the poor, regulation of the economy, even public education, is radical. To end the separation of church and state and create a Christian government and legal base is radical. To bring effective democracy to an end and hand all political power over to a corporate plutocracy is radical. Conservatives do not advocate radical change. And so the people who advocate these radical changes are not conservatives.

Over the course of elections since 1980, I have watched the wingnuts take over more and more of the Republican Party from the true conservatives that used to dominate it. I kept hoping that the process would have a natural limit, that the GOP would come to its senses at some point and return to traditional American values and its own previously-solid conservative function in the dialogue. It's still conceivable that they may, but given the depths to which the party has sunk at this point, I think we need to entertain and plan for the contingent possibility that they also may not. What happens then?

There are still a few conservatives in the Republican Party, and also quite a considerable number of them in the Democratic Party, but Republican conservatives have become an endangered species. (Due to the wingnuts having hijacked the term "conservative," these Republican conservatives are nowadays known as "moderates." I refuse to cooperate with that theft of a perfectly good term by those to whom it does not properly belong, and so insist on calling these politicians conservatives, which they are.) We hear today that Florida Governor Crist, a conservative who will almost certainly lose the GOP Senate primary this year to a wingnut, will probably ditch the GOP and run as an independent. A few conservative Republicans have already left the party and either become independents or joined the Democrats. John McCain of Arizona is another conservative who faces a primary challenge from a wingnut, and although he has not indicated any inclination to jump party, none of us should rule out the possibility at this point. The campaign by wingnut Republicans to purge the party of conservatism shows no sign of abating.

At the same time, as we progressives are painfully aware, the conservatives within the Democratic Party are making it harder for progressives to achieve what they should be doing. Well, of course that's what conservatives are supposed to do, but the problem is that the progressive-conservative dialogue, which has mostly become intra-Democratic, is in turn hampered by the howling wingnuts on the other side of the aisle. It's very difficult for Democrats to manifest both sides of a healthy political dialogue (progressive and conservative), and at the same time present a united front against wingnuttery. There's a strong tendency for people on our side of the discussion, that is to say, progressives, to turn upon conservative Democrats in wrath and insist that they be replaced by progressives, a desire that is amplified by fear and loathing of the wingnuts. Our political landscape is rapidly changing from one of progressives and conservatives to one of progressives and wingnuts, with conservatives squeezed out of the picture altogether.

Folks, that is not a good prognosis! We NEED conservatives, and we certainly do NOT need wingnuts! So I think it may be time to consider some practical contingency plans for bringing conservatives, real conservatives, non-crazy conservatives, conservatives-not-wingnuts, back into politics with a home of their own.

The simplest and best solution would of course be for the Grand Old Party to recover from its thirty-year binge and return to sobriety. Let the "big tent" goppers win the intra-party argument. Let the wingnuts be consigned to the wings and fringes where they belong. Let genuine conservatives again take their proper places as the loyal elected opposition. A nice dream. Maybe it will become real. But I'm no longer willing to hold my breath waiting.

Failing that, what we may need is a new political party. Governor Crist, rather than running as an independent, should found this party. I don't have any idea what to call it -- well, sure I do; it could be called the Conservative Party. But maybe that has too much potential confusion with the British party of the same name. The Party of Sanity is too flippant, as is the Party of Non-Wingnuts. Ah! I have it! We can bring back the oldest, most original name for an American party of conservatives there is, and call them the Federalist Party. Or maybe they can come up with a better name themselves, but I'll use that tag provisionally here.

The Federalist Party would include such Republicans as Crist, Olympia Snowe, Tom Campbell, John McCain, Arnold Schwarzenegger, and similar targets of wingnut loathing. It would also find room within its ranks for Democrats (and ex-Dems) such as Blanche Lincoln, Joseph Lieberman, Ben Nelson, and so on. Since these people would no longer be competing in Republican (or Democratic) primaries, there would be no pressure on them to adopt wingnut positions; they could remain true to their conservative beliefs and let those beliefs compete honestly and fairly in general elections against both progressives and wingnuts.

After all, there's really only one reason why the wingnuts are getting anywhere at all: they are big fish in an increasingly small pond, as the number of voters willing to call themselves Republicans declines, and the smaller and smaller numbers that remain are increasingly dominated by wingnuts. This means that on election day, it becomes increasingly likely that one of the candidates in every election will be a wingnut. So I say, let that process reach its logical conclusion, let the Republicans become purely a wingnut party, and let those Republican conservatives who remain have somewhere else to go besides the Democrats. Since under those conditions wingnuts would win very few elections indeed, the Republicans would, over a few election cycles, quickly go the way of the Whigs, and future elections would be mainly between the progressive Democrats and the conservative Federalists. (At least until we adopt proportional representation so that we can have more than two active serious political parties. But that's a change of subject.)

Would this be better or worse in terms of elections for progressives? I'm going to have to be honest here: it would be worse. There's no question that, most of the time, a progressive can beat a wingnut in an election more easily than a sane conservative. So if all we care about is the short-term goal of electing progressives, a return of genuine conservatism isn't a good thing. But I don't think that is all we should care about. It also must be recognized that wingnuttery does not deserve to be represented in Congress, yet in many districts, replacing wingnuts with progressives is simply not feasible; the people might be uneasy with their wingnut reps but they don't want any dad-gum lib'ruls neither. So a real, true conservative Congresscritter would be the realistic alternative, better than a wingnut because, well, anything is, and better than a progressive because he or she would represent the people of the district, which a progressive (in all honesty) would not.

Let me repeat the first sentence above: A healthy political dialogue in a progressive society would occur between progressives on one side and conservatives on the other. That's something we don't have any more. It would be good if we did. We might not elect as many progressives that way as when the only alternative to progressives consists of clowns and zanies, but on the other hand we would elect no clowns and zanies. And that would be better for America.

Sunday, April 18, 2010

There's Racism, And Then There's Racism

Is the Tea Party movement racist? Seems to me it is or it isn't depending on what kind of racism one means.

There's no question that opposition to President Obama from the right is vehement to a degree not really explained by his policies. This is not unlike the wild opposition to President Clinton, who was less progressive than Obama but also incurred loathing and fear on the right. Because Obama is black, the idea has arisen (and a certain amount of polling data in support of it has been presented) that this vitriol is based in racism. The fact that something similar was encountered by President Clinton, who is white, would seem at first glance to argue to the contrary. In fact, I contend that it supports the idea, if one examines the likely explanation for what DID generate that opposition.

Some racism is overt, crude, unsubtle, and blatant. Some racism is covert (or even unconscious), subtle, internalized, and unacknowledged even to oneself. Very little of the opposition to Obama is the result of overt racism. But a great deal of it is at least in part the result of covert racism.

An overtly racist objection to Obama would exist when a person feels, and admits to himself or herself (if not always to others), that a black person should not be president. Evidence of overt racism would be found when a person actually says something like this, or when a person is affiliated with a racist or white nationalist organization (e.g. when the person is a regular poster at Stormfront). Some of this does exist of course, but I am prepared to accept that the overwhelming majority of the Tea Party movement isn't part of it.

A covertly racist objection to Obama would exist when a person has no problem with a black person being president, but does have a problem with the idea that a black person could be elected president. That is to say, the person holding this attitude doesn't think black people are inferior to white people or inherently unqualified to be president, and may be willing to acknowledge that Barack Obama is a sharp guy who is just as capable at the job as a lot of white guys who have held it before him. It's not him. It's what his being elected says about what has happened to America.

I had a similar impression about the vitriolic opposition to Bill Clinton. Clinton's a white southern boy, of course, but he's also a notorious womanizer who evaded the Vietnam draft, smoked dope, grew his hair long, and married a tough feminist b**ch. For those who are inclined to freak out about the cultural changes that occurred over the 1960s and 1970s, he was a walking red flag, not because of his politics (which are pretty far right as Democrats go), but because of his cultural trappings and who he is as a person. In their America, the America they fondly remember from their childhood and would like to believe still exists -- in the REAL America, as they imagine to themselves -- someone like that would provoke revulsion and could NEVER win the nomination of a major party, let alone actually be elected. The vitriolic opposition wasn't really about him. It was about what his electability said about how America had changed and in what directions.

This impression was reinforced during the impeachment fiasco, which led Paul Weyrich to say, "I no longer believe that there is a moral majority. I do not believe that a majority of Americans actually share our values. If there really were a moral majority, Bill Clinton would have been driven out of office months ago. It is not only the lack of political will on the part of Republicans, although that is part of the problem. More powerful is the fact that what Americans would have found absolutely intolerable only a few years ago, a majority now not only tolerates but celebrates."

In a similar way, Barack Obama's election says something about what America has become and is becoming that some people don't want to accept. And that change is not so much cultural as racial, although there are cultural overtones, too.

White people are in the process of becoming a minority in this country. It hasn't happened yet, but it's in train. When it does happen, whites will be the largest minority, but still will represent less than 50% of the population. In the sepia-toned memory photographs of Obama's detractors, real America is a land predominantly of white people. Sure, it has nonwhites in it, and if you ask these guys they'd happily tell you that racial discrimination and Jim Crow and segregation and all that nasty stuff from our past had to go and they're glad it's gone. At least most of them will, and most will even mean it and believe it. But what they envision is an America of white people who are magnanimously, righteously non-racist and willing to generously tolerate and accept minorities in our midst on a (somewhat) equal basis, 'cause that's what great and wonderful people white Americans are. The idea of white people no longer being a majority, and thus no longer able to call the shots and be magnanimous and generous and so on, that doesn't sit well. But that's what the future holds.

As we approach that future, it becomes increasingly probable that someone non-white will gain the White House, and now it's happened. Obama was elected because America has a whole lot of black and Hispanic citizens who voted for him in lopsided majorities. Obama was also elected because a whole lot of young people -- including young white people -- don't care that the country is heading for a white-minority future. Obama being elected president says that the uncomfortable future is closer than they thought, and his dusky face on the television above the presidential seal is a harsh reminder that the world of those sepia-toned memory photographs no longer exists. It makes them feel out of place in the world that surrounds them now.

And that feeling infects everything else, and magnifies small political objections into big ones, and causes irrational and unbelievable accusations to be believed without serious critique. It isn't racist in the sense of being bigoted and thinking no black guy should be allowed to be president or can possibly have the smarts for it. But it is racist in the sense of being based in a lament for the fact that America is rapidly ceasing to be a white people's country.

Saturday, April 10, 2010

Slavery, Serfdom and Wage Work: The Forms of Coercion

I continue this week to encourage radical thinking, and to build on the post from last week. Last week, I explored the origins of capital property ownership, how it separates the right to own wealth from the work to create it, and the consequent nature of profit as a form of theft.

This week, I want to explore the lynchpin of all class privilege throughout the history of civilization: the ability to coerce the labor of others for the elite’s profit and the elite’s ends. Historically, there have been three broad methods by which the labor of the many has been channeled to the ends of the few, declining in brutality and increasing in subtlety from one to another, but all of them coercive in one way or another. These three methods are slavery, serfdom (and variants), and wage-work.

I don’t mean to suggest that there is perfect moral equivalence among the three. To be a wage worker is immeasurably better than to be a slave. The abandonment of slavery, and the near-abandonment of serfdom, really does represent progress in human rights and the human condition. But while working for wages is certainly not slavery, it is no more accurate to call it freedom. The only people who are free are those without masters, without bosses – those who work for themselves.

There was a time, early in the history of civilization, when that was pretty much the case for most people. The normal condition for a person in ancient times was not that of a hireling but that of a small farmer or craftsman, an owner of one’s own business. Working for someone else for pay was thought of as a transitional phase, something one did in order to learn a craft or to acquire the necessary capital to buy one’s own land. And of course, working for someone else was completely unknown in precivilized times. The transition to the current situation, in which the overwhelming majority of people work at jobs serving the profits of others, with no entitlement to the fruits of their own labor, did not develop overnight. The circumstances of servitude have grown less severe with the passage of time, but at the same time the condition of actual freedom has grown rarer and rarer.

One of the earliest forms of working for another, and the first to be employed on a large scale, was slavery. We may consider this the template. Initially, slavery probably arose as a consequence of war. When the victors in a war conquered an enemy, they gained more than the land that the enemy had occupied. They also gained the surviving enemy citizens as captives. Even when the conquest was less complete than that, captives were often taken in the course of the fighting and could be brought to the homeland and forced under threat of punishment to work for the victors. Of course, just as with the enemy’s land, the enemy people became disproportionately the property of the elite, who found themselves the owners of large tracts of land worked by slaves and generating a lot of money without the owner having to work on it at all. (Profit being theft, as noted last week.)

Over time, slaves became property to buy and sell just like land itself, and the pattern emerged of a class of warrior-aristocrats living off the labor of people who had no rights under the law (or few, depending on the society) and whose only purpose in life was to serve the interests of their masters. This became the template for all elite classes from that time forth. Like most prototypes, it was crude and unsubtle compared with the more sophisticated ways of compelling labor that followed. It suffered from numerous disadvantages, including slave revolts and a lack of motivation on the part of the workers. Nevertheless, it sufficed to keep the aristocratic class in wealth and power for thousands of years and in many different civilizations. Even more importantly, the underlying idea that the elite deserved to be served by a class of workers and to become rich from their labor became so entrenched that it survives to this day, many years after slavery itself has been outlawed.

One problem with slavery is that it was universally unappealing to the slave. (Or nearly so. There are instances in ancient times of highly skilled persons selling themselves into slavery, knowing that their skills would earn them favored treatment and a better circumstance than they could achieve in freedom. However, that’s the exception; very few slaves ever became slaves by choice.) People resisted becoming slaves and had to be forced into it when captured in battle or condemned for debt or for some other legal offense for which slavery was the penalty. There was really no way to reduce most of the people to a state of slavery, because so large a number of slaves would have been impossible for the remaining free population to control. In order to increase the number of people who could be reduced to servitude, it was necessary to make the conditions of servitude less drastic than was usually the case with slavery.

Some examples may be found prior to the industrial revolution of a form of coercion gentler than slavery, but still more direct and brutal than wage work. This consisted of a defined set of obligations on the part of a worker, who was forbidden under most conditions to leave his employment, but who also had more rights under the law than a slave. I’m going to call this sort of arrangement “serfdom,” but I should explain that I’m talking about a broader category of social arrangement than serfdom proper. The peasantry of medieval China or Japan, or the sharecropping and tenant farming arrangements in the post-emancipation American South, fit into this general category, as well as the condition of the medieval European serf. Because serfdom was less onerous than slavery, because it entailed some rights on the part of the serf and some obligations to the serf on the part of the master, it was possible to have a larger population of serfs than could be maintained as slaves. Even so, it turned out not to be as perfect a solution as wage work: the industrial-era answer that has turned nearly everyone into a tool of the elite.

Anyone can see how slavery and serfdom are coercive arrangements, because the victim is punished for refusing to work. But in the case of wage work, the coercive nature of the institution is less evident, because a wage worker is not directly punished for refusing to work. The only punishment is to withhold a reward: failure to work means the worker will not be paid. But it is still coercive, and the coercion still takes the form of punishment or threat of punishment. It’s just not applied by his immediate employer, nor directly for refusing to work. The coercion applied to a wage worker is applied before he ever accepts a job. It is built into the system of ownership that concentrates possession of capital property into a few privileged hands. It punishes the wage worker, not for refusing to work, but for attempting to work using capital property that belongs to the elite. Since he cannot obtain capital property of his own, he is unable to produce wealth on his own for his own use or for sale to others. As such, he has no independent way of supporting himself. He must work for the profit of another, in return for the means to support himself and his family. Rewards are sufficient motivators only to the extent that the person receiving the reward suffers from deprivation. If the wage worker can support himself through his own labor on his own behalf, rather than in service to another, then his desire for monetary reward is satiated, and he will have no reason to surrender his liberty. The rat will run the maze in return for food pellets, but only if it is kept hungry.

Because the ability of an employer to apply direct coercion is limited, and because the wage worker is allowed by law to voluntarily leave his employment, refusing to work but giving up his wages, it carries a greater semblance of freedom than either slavery or serfdom. It has been possible to argue that a wage worker “voluntarily” enters into an employment agreement, and so is actually free. The argument is specious, of course, because the only way the agreement could genuinely be voluntary is if the worker had the right and opportunity to support himself without a master. When the alternative is starvation, no real choice exists. It has also been possible to argue, with equal speciousness, that the worker rather than his master owns the fruits of his labors, by confusing the real fruits of his labors – the goods or services that his labor creates – with the reward his employer offers for surrendering them. Let there be no confusion on this point. A wage worker is not a slave, nor is he a serf. But he is most certainly not free.

In addition to keeping capital property concentrated in few hands – actually, in service to that necessity of universal coerced labor – it has also been desirable from the standpoint of the elite to keep the rewards paid for wage labor as low as practical. This was desired partly so as to maximize the share of wealth held by the elite, of course, but also to reduce the chance of a wage worker freeing himself by saving sufficient money to go into business or, through investments, to support himself without working. Even if a worker is unable to completely free himself from servitude, if he is well paid and lives within his means, his options become wider and he is much harder to manipulate. If asked to do something unacceptable, an employee who can survive without work for a year or more is much more likely to quit than one who lives paycheck to paycheck.

In the end, it’s all about power, even more than about money.

And that will be the subject of next week’s post.

http://www.smashwords.com/books/view/8357

Sunday, April 4, 2010

Profit Is Theft

One of my purposes in writing this blog is to encourage radical thinking. Not necessarily radical action (although radical thinking does radicalize action to a degree), but thinking that cuts through the false assumptions and intellectual ruts at the roots of a lot of habitual thought in politics, economics, religion, and art. If we can think radically, possibilities open to our consideration that we would never even imagine otherwise.

This week, I want to discuss two concepts that are crucial to any capitalist economy, and that are older than civilization, but much younger than the human race: the private ownership of capital property, and the related concept of profit. These were, for their times, radical ideas. Today, pointing out that they are not inevitable or natural ideas has itself become radical, and so doing that has become necessary.

Property ownership in some forms is as old as the human race, or somewhat older. But the property that our precivilized ancestors owned was all personal property, not capital property. Individuals owned things that they planned to use and enjoy themselves: clothing, tools, weapons, food stores, maybe a tent or a place in the communal dwelling. But no individual owned the land from which all these things came. An individual hunter could own the meat from his own kill, but not the hunting ground. The same hunter could own the spear he used to kill his prey, but not the flint quarry that its spearhead came from. Land was different from other types of property in that it was used to make wealth, rather than being wealth itself. In precivilized society, it was the property of the band or the tribe, not of any individual. Any property that a person owned, he owned because his own work had made it, or because he had traded something produced by his own work for the product of someone else’s work.

Let’s look a bit more closely at that paradigm of property, because it contrasts greatly with what obtains today.

The source of wealth (the land) is owned communally.

The land is available to anyone in the band or tribe that is capable of making wealth from it.

If a person makes something, then (subject to tribal rules about sharing food and other necessities to make sure no one goes hungry or otherwise suffers unnecessarily) that person owns it. Labor defines ownership.

Private ownership of capital property was introduced with civilization. It created a very different paradigm of property ownership that worked like this:

The source of wealth (the land, and later on industrial plant and sometimes intellectual property) is owned by individuals.

The land and other capital property are only available to make wealth from with the permission of its owner.

If a person makes something, then (subject to laws which take a portion in taxes to cover public expense) it belongs to the owner of the capital property from which it is made. Labor does not define ownership. Ownership of capital property, and nothing else, defines ownership of the wealth produced from it.

Note the difference? When capital property was communally owned, it was labor that defined the ownership of wealth. Each person owned what he worked to produce. But since capital property has become privately owned, that ownership is now what defines ownership of the wealth produced from it. Today, no one owns what he works to produce, at least not because he works to produce it. Ownership is defined by ownership itself. To own capital property is to own what is produced from it, whether you do the work to produce it or someone else does. If you own capital property, that entitles you not only to the fruits of your own labor applied to that property, but also to the fruits of other people’s labor applied to the same. If you do not own capital property, then you are not entitled even to the fruits of your own labor.

This may be counter-intuitive, so let me go into a little more detail. Some may respond: aren’t people paid for their work? Don’t they own the fruits of their labor in the form of their wages or salaries?

No. They do own their wages or salaries of course, but that is NOT the product of their labor. That is the fee paid them for doing the work even though someone else owns the product of their labor. The product of a person’s labor is the goods or services produced by it, and that belongs not to the worker, but to the owner of the capital property the worker used to produce it. What’s more, it is always worth more in sale value than the wages paid those who produce it. As an employee, you are paid only a portion of the value of what your work produces – as small a portion as your employer can pay and still get you to do the job, and certainly never equal to the full value.

This brings us to the related concept of profit. What is profit? It’s defined as the revenues generated by a business minus its expenses. It may also be regarded as the net share of wealth going to the owner of capital property. Or, less even-handedly, it is that portion of the total wealth of an enterprise that the owner skims from the labor of others.
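To make the arithmetic concrete, here is a purely illustrative sketch with invented numbers (nothing below comes from the essay except the two definitions it gives: the accountant's revenue-minus-expenses view, and the view of profit as the unpaid portion of the value workers' labor adds):

```python
# Purely illustrative arithmetic; all figures are invented for the example.

sale_value = 100_000   # market value of the goods the workers' labor produced
wages_paid = 60_000    # what the owner pays the workers for that labor
expenses   = 25_000    # materials, rent, and other non-labor costs

# Conventional accounting definition: revenue minus all expenses, wages included.
accounting_profit = sale_value - wages_paid - expenses

# The essay's framing: workers receive only a portion of the value their
# labor adds; the remainder accrues to the owner of the capital property.
value_added_by_labor = sale_value - expenses
share_kept_by_owner  = value_added_by_labor - wages_paid

print(accounting_profit)      # 15000
print(value_added_by_labor)   # 75000
print(share_kept_by_owner)    # 15000 -- the same figure, seen from the other side
```

The two calculations necessarily agree: the owner's profit is, term for term, the gap between what the workers' labor sells for and what they are paid for it.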

To make this clear, I’m going to exercise a bit of author privilege, or linguistic irresponsibility, and slightly redefine the word. (I have no shame. It’s true. Ask anyone.) For purposes of this writing, “profit” applies only to that portion of a business’ net revenue that is not produced with the owner’s own labor. This means that if you are the sole proprietor of a business with no employees, your business makes no “profits” in this sense, because your labor and no one else’s has generated the goods or services which have been sold to generate revenue. I’m doing this because I want to illustrate something about the great majority of business profits in our economy, which is however not true of situations such as I just described.

Profit, then, as I am using the word, is wealth amassed through other people’s work.

It is in this sense of the word “profit” – although I must emphasize that the vast majority of what accountants call “profit” does meet this definition – that profit is theft. It is the producing of wealth through the labor of other people, who are paid less than the value of the goods and services their labor produces. The owners of capital property – property which, in the natural state that our ancestors occupied for over a hundred thousand years, many times the duration of civilized life so far, was owned communally and not the property of any one individual – are taking wealth that other people have produced, and that in a natural society would belong to the people who produce it. And that is stealing.

But this act of theft is perpetrated by almost all owners of capital property without a shred of guilt, with even less shame than I feel in redefining a word here and there, because it has become endemic in our society and perceived as the natural order of things, no matter how unnatural it actually is. And it is completely unnatural, in two ways. Not only have we redistributed capital property, which in our original, natural societies was held in common, into private ownership, but we have also changed the rules about who owns what is produced from it, so that ownership rather than labor determines ownership. In natural, precivilized societies, capital property was owned by the society, but the society did not own the wealth that was produced from it. The individual who did the work owned the product of his work. (Subject, of course, to rules distributing food to the hungry and such, but that’s functionally equivalent to taxes today, and is a footnote to the process, not the main description.) So not only have we gone from an arrangement in which capital property is publicly owned to one in which it’s privately owned, but at the same time we’ve gone from a system in which labor defines ownership to one in which ownership defines ownership. We have done this, obviously, to benefit the owners of capital property, who enjoy enormous privileges both economic and political in a modern society.

As noted above in the first paragraph, I’m not proposing any particular action here. We are long past the time when we could restore communal ownership of capital property, or at least I can’t think of any way to make that work in a modern industrial economy. Then again, perhaps there is a way and I simply haven’t thought of it. Certainly it’s a millennia-old Gordian knot of privilege and power, not easily undone. But the mind is as sharp an implement as Alexander’s sword, and merely to recognize the reality of what is and why serves by itself to put things into a new perspective. Also, there are some consequences of this recognition – that all for-hire workers are being systematically plundered by a system designed to create and reward privilege – which will be explored in future posts. At the very least, this perspective will hopefully give many people the idea that things which have been taken for granted should be changed, which is a prerequisite to the consideration of exactly what they should be changed into.

Next week: slavery, serfdom, and wage work, or, the forms of coercion.


Sunday, March 28, 2010

This Is No Time For Compromise

Can we now dispense with the word “bipartisanship”?

We are in a Crisis era, a Fourth Turning. Roughly once a lifetime, we go through a period of civic upheaval in which our national institutions (political and economic) have, for one reason or another, become dysfunctional. The last time this happened was in the 1930s-40s with the Great Depression followed by World War II. The time before that was in the 1860s-70s with secession, the Civil War, and Reconstruction. The time before that was in the 1770s-80s with the American Revolutionary War and the framing of the Constitution. You can find out more about the concept at this web site: http://www.fourthturning.com/. But what I want to write about today is not the overall concept of the generational cycle and the Fourth Turning. I want to talk about a specific characteristic that all Fourth Turnings have, this one (so far) included. That characteristic is divisiveness. It’s something that is often decried, but it is in fact a good thing – indeed, an absolutely necessary thing.

A Crisis era (such as this one) is a decisive time. It’s a time when much-needed reforms are put in place, reforms that have been neglected for decades. It is not a time for compromise or soft talk or middle courses. It’s a time when consensus cannot be achieved, when conflict arises between those who see a need for the new and those who would preserve the old, however dysfunctional it may be. It is the nature of such a conflict that it cannot be resolved through agreement. There must be a victory, and there must be a defeat. Consider the three Crisis eras from our nation’s past, beginning with the American Revolution.

In 1773, tensions had been rising between England and the American colonies for decades. The expensive conclusion of the Seven Years War (or French and Indian War to Americans) moved the British government to try to get the colonies to contribute financially to their own defense. A reasonable request, of course, but it ran head-first into the colonists’ conviction that they had come to America in the first place in search of self-rule, and that Crown and Parliament had no proper sovereign authority over America. London’s position was diametrically opposed: the British government insisted on its right to govern all British territory, including the colonies in America.

This impasse had grown over time. Prior to the French and Indian War, the British government didn’t really make any attempt to govern the colonies. Britain used America as a convenient dumping ground for condemned criminals (as she would later use Australia), a source of raw materials, and a market for manufactured goods, but otherwise left the colonists to their own devices. Opinion in Britain had always been that the Crown and Parliament held sovereignty and the right to govern, but why bother? As American society became more developed and sophisticated, though, as population grew, and as the war with France forced Great Britain to take an interest in (and spend more money on) America’s defense, the attitude of the British government and that of the Americans approached collision.

A number of taxes were imposed on the colonists in the years following the end of the war, provoking a storm of protest. The government backed down and repealed most of these taxes by the early 1770s, retaining only a token duty on imported tea.

Was the tea tax onerous, an unconscionable burden threatening to reduce Americans to abject poverty? Certainly not. It was barely a tax at all. It would fall short of paying for the French and Indian War by millions of pounds. Most Americans would likely shrug their shoulders, pay the duty, and hardly notice. But the Tea Act, if allowed to stand, set the precedent that Parliament had the authority to tax the colonists and to legislate in other ways. Rather than accept this, a radical group led by Samuel Adams engaged in a bit of guerrilla theater, nonviolent civil disobedience, and applied vandalism, and destroyed a cargo of tea in Boston harbor.

This was not a move intended or calculated to provoke compromise. In response, the British government didn’t compromise, either. It imposed a series of Coercive Acts (or “Intolerable Acts” as the Americans called them) which further roused the Americans’ ire. Americans began forming militias and stockpiling arms and ammunition. The Crown dispatched reinforcements to America and negotiated with the German principality of Hesse for mercenary troops. The Americans formed a provisional government and appointed George Washington commander of its newly created army, which set about besieging the British forces in Boston. Battles were fought. Washington’s forces outmaneuvered the British at Boston and forced them to withdraw. The British thereafter returned the favor at New York City and nearly (but not quite) destroyed the Continental Army. The Congress passed a motion to declare independence from Great Britain. From that point on, the lines were drawn and no compromise was possible. Either America would become fully independent of Great Britain, or the colonies would submit to British rule, but the prior condition of loyal but self-governing colonies would cease to exist, one way or another.

Does this begin to sound familiar in terms of our current situation?

We can also compare it to what happened in the 1860s. Tensions had been building over issues related to industrialization of the country, particularly slavery, for many years. The territories acquired during the U.S.-Mexican War were a focus for much of the argument, since they would eventually become states and their representatives in Congress would weigh in on one side of the divide or the other. The newly-formed Republican Party represented the interests of the northern capitalists and of the abolitionists (who were in agreement over the specific issue of slavery; both opposed it although for different reasons). A moderate Republican, Abraham Lincoln, was nominated for president in 1860. Lincoln was not proposing to outlaw slavery, but did propose to keep it out of the new states formed from the western territories. This would, over time, result in an anti-slavery majority in Congress, and the planter interests saw the writing on the wall.

A true compromise on the issue of slavery would have resulted in gradual emancipation with compensation paid to the slave owners for loss of their property, but the hard-liners were not interested in that on either side. Southern fire-eaters saw an opportunity to provoke secession from the U.S. by states that permitted slavery. The strategy for this was to ensure a hard-line pro-slavery Democratic candidate in the election. Moderate Democrats held their own convention, with the result that the party split and nominated two competing presidential tickets, both of which lost (predictably enough) and Lincoln won with a plurality of the popular vote, exactly as the fire-eaters had intended. Seven states promptly seceded. Lincoln initially attempted a compromise solution and peaceful rejoining of the Union. The seceding states were having none of it. They formed a new central government with a Constitution modeled on the one they had abrogated (with a few appropriate changes) and, in a dispute over a federal fort within the borders of one of the seceding states, went to war.

Once again, an irreconcilable conflict existed. The southern planters wanted to preserve an antique way of life based on wealth generated by growing cash crops with slave labor. The northern commercial and industrial interests wanted to pursue an increasingly mechanized and industrialized future in which slaves would be replaced by machines and finance capital would dominate the entire economy, and the emancipationists, their temporary and ad-hoc allies, wished to free the slaves for moral reasons. A solution might have been found short of war, but it would have required the planters to accept defeat and seek the best compromise deal they could get. They were unwilling to do that. And so the lines were drawn once again, and the conflict fought to the finish.

The Great Depression was less violent, but no less uncompromising. A breakdown of the capitalist economic system with its governing philosophy of laissez-faire left some 25% of the workforce unemployed. Neither the breakdown nor dispute over that philosophy was new; the industrial economy put in place after the Civil War suffered periodic financial panics and depressions roughly every 20 years. The philosophy itself was opposed by labor union activists, anarchists, socialists, and Communists. Class conflict had been intensifying for decades. The Depression brought it all to a head. Herbert Hoover, the president when the economy tanked, was no laissez-faire purist, or so one would judge from his past. But he moved in that direction in the face of disaster, perhaps out of genuine conviction or perhaps because the Republican Party demanded it of him. The conflict this time was political and electoral and did not involve guns (which we may take as a sign of progress), but it was no less decisive. Over the years of Franklin Roosevelt’s presidency, laissez-faire was abandoned. The workplace was unionized, the government regulated the banks and other industries, and the first social welfare programs (Social Security and unemployment insurance) were put in place. By the time World War II was over, a new economy had been crafted, a mix of capitalist and socialist elements. This was not accomplished through bipartisan compromise any more than the changes of the American Revolution or the Civil War were. The divide was sharp and partisan, with the Democrats on one side of it and the Republicans on the other. The Democrats won, and the Republicans lost.

In the present time, we again face a situation similar to those three. The economy has again broken down, although not as severely as during the Great Depression. In addition, we face shortages of key raw materials and severe environmental dangers. The problems this time are global in scope. The global economy is beyond the power of any one national government to regulate – an international means of regulating it is required. One economic problem that was not present in the 1930s was a shortage of fuel; the U.S. was still a net exporter of oil then. Today, we are faced with the need to transform our energy economy away from its dependence on oil – no easy task. We cannot simply apply the same methods that worked in the Depression, despite a superficial similarity.

On all of these points, we do not find national unity. There are voices on the other side, claiming that the problems don’t exist, or that we can solve them without changing the way we do business. In many cases, these voices are cynical and insincere, acting not with genuine public concern but out of a desire to protect private profits. We saw how fiercely the lines were drawn over the health-care reform debate. This is the template for the next few elections. A compromise, “bipartisan” solution will, almost by definition, be an unworkable one. We must accept that the conflict exists. It’s too soon to broker a negotiated settlement. First, we must win. Then we can make peace.

I hope – and given the example of the Great Depression, I cautiously believe – that I speak of “winning” and of “peace” only in metaphor. Some violence, however, has already occurred. It remains to be seen whether those who are defeated at the polls (or rather, some of their crazier supporters) will resort to the cartridge box instead of the ballot box. Let us pray not. Such efforts would of course be defeated, but in the course of it lives would be lost for the most futile of causes. In that sense, I hope that we have peace now, not after victory. But at the same time, we cannot let the danger of violence deter us from doing what must be done.

In any case, it’s time to jettison the search for “bipartisanship.” There will come a time later on, after the necessary reforms are in place and their opponents have accepted reality, when consensus may be sought once more. But that time is not now.