Thursday, December 29, 2011

Mysticism, Myth and Make-Believe

Religious teachings and ideas consist of three things: mysticism, myth, and make-believe. Or, as I've said somewhat less precisely in another context, inspired wisdom and comic-book stuff.

Where does religion originally come from? Of course one may provide a cynical answer to this question; one may assert that religion comes from a desire for power on the part of a priesthood, or from a desire to explain the unexplained, or from a desire for immortality or fear of death, or from a desire for certainty in an uncertain world. But while all of these are factors in determining the beliefs that make up a religion, there is one other element without which religion wouldn't even exist: mysticism.

By "mysticism" I mean the direct personal experience of -- well, of those things that mystics experience. Call it the Underlying Reality, or UR for short. Any words I might use to describe exactly what the UR is would be metaphorical at best and misleading at worst. If you reading this have undergone mystical experience, you know what I mean. If you haven't, unfortunately, I can't tell you. But there are states of consciousness which the human mind can achieve either spontaneously or by various methods, and in which one comes to an intuitive understanding regarding one's identity and place in the cosmos. Mystical awareness has been called many things: communion with God, union with God, communion or union with the universe, eradication of the ego or of the self, awakening from sleep or from a dream, penetration of the illusion to find reality. All of these are metaphors, which is why I'm using a vague term here like UR, which really doesn't mean anything and so, if it fails to inform, should at least also not confuse. Again, if you've been there, you know.

One thing that has sometimes happened is that mystics have felt a compulsion to communicate their teachings to other people. I guess most of us go through that desire at some point or other (and look, here I am surrendering to it yet again). Something interesting happens when they do. Two interesting things, actually. The first is that hardly anyone understands them (only other mystics, who don't need the instruction, can really comprehend it). But the second is that what they say often resonates with a kind of unconscious awareness we all have. It's as if the understanding of mystics is stored inside our brains where we can't normally get at it, and pops up to say "Hey! Here I am!" whenever it gets any encouragement. And so when a person reads a parable of Jesus from the Gospels or the teachings of the Buddha from the Sutras or the Tao Te Ching by Lao Tzu or some of the more seminal passages of the Bhagavad Gita, the thoughts expressed in those words touch off little explosions deep inside the soul. And so the mystics attract followers who don't really understand what their teacher or guru or whatever is talking about, but know that they like it and want to follow it.

And then another thing happens when the mystic teacher dies, as is of course inevitable. The written teachings are then all that's left, and there is usually no one around anymore who really understands them, but there remain enthusiastic followers who want to believe. And that's where all those other contributions to religious thought come in: the desire for power, to explain the unexplained, to deny death, or to achieve certainty.

Now, within most religions or at least within an esoteric branch of them one can always find a framework for pursuing mystical enlightenment, together with techniques for achieving it. Since it's only possible to know the UR by personally experiencing it, this sort of instruction within a religion -- instruction as to how to personally experience it -- is the only religious knowledge that can be conveyed directly and straightforwardly. That's mysticism: the first category of religious teaching.

One can also find plenty of myth, and by that I mean ideas and stories that provide metaphorical descriptions of the UR or some aspect of it. Myths, like the teachings of great mystics, resonate in the brain's hidden recesses. But we must always remember that myths are metaphors; it's their resonance with the buried mystical awareness that's important, not their literal truth. The central Christian myth of the Resurrection is a perfect example. Many Christians actually believe that Jesus literally rose from the dead; I regard that as extremely improbable for obvious reasons, but that's not really the point here. The point is that the Resurrection as myth is more important than the Resurrection as fact even if it IS fact. What this story says about the process of awakening and the experience a mystic goes through to come to that point is what's important here. And the same is true of other myths in other religions. It's not important whether the Buddha was actually raised with such imposed naivety that he never saw sickness or poverty or death until he became a teenager and jumped over Daddy's royal wall. (Yeah, like he never suffered a childhood disease . . .) It's not important whether his mother had a dream in which she was impregnated by a white elephant. It's not important whether Moses really brought monstrous plagues down on Egypt or caused the sea to part or obtained God's will inscribed on stone tablets by the divine hand itself. What's important in every case is the symbolic power of these stories, the way they resonate (once again) with that hidden knowledge we all carry.

The final category of religious thought is make-believe. Now, make-believe can sometimes have the same content as myth; in fact, that's very often the case. Those who believe a myth to be literally real and (more importantly) make a big deal of this, so that it sets their religion apart from all others, are engaging in make-believe.

Make-believe in religious thought and teachings generates a narrative that aggrandizes a religion's power and importance.

God, the creator of the universe, selected one particular tribe of humans as his particular servants, and expects more of them than from others, and visits them with blessings when they live up to these expectations and with tribulations when they fall short.

God sent his son to sacrifice himself for the sins of the world, so that those who follow his religion can be saved from hell and achieve eternal life in bliss.

God has addressed mankind via a series of prophets, and we who follow the last of the prophets have his real, true teachings, all previous prophetic teachings having been either superseded or corrupted or both.

See the pattern? In reality, awareness of the UR is something that all human beings have as a potential. It's there, separate from any religion, ready to guide and lead. The experience of finding it is something often hinted at in the teachings of all religions, and all religions should consider themselves to be signposts pointing the way towards the reality at which they can only hint. When all religious teachings are properly understood to be what they are -- myths and metaphors -- how can any religion ever claim to possess THE truth, or to be true while all others are false? Literally speaking, there is no such thing as a "true metaphor." (Or, of course, a false one.) When a religion takes this kind of humble approach and accepts that its teachings are not THE truth, but only one version, one myth, one pointer towards the knowable-but-not-tellable, then it will leave make-believe behind and deal only in mysticism and myth.

But when it obsesses over the make-believe aspects of its teachings, then it ceases to be a guide to the UR and becomes a barrier between it and the believer. And that is also when it becomes potentially something dangerous.

No matter how sophisticated our knowledge of the universe becomes, there will always be a place for both mysticism and myth. But there really should be no place for make-believe outside of fiction.

Wednesday, December 21, 2011

Time to Amend the Constitution

Blocking port operations on the West Coast is all well and good as a show of strength and a demonstration that the Occupy movement still breathes, but otherwise it is only tangentially related to the purposes for which the movement exists. That probably explains why participation in the action is small compared to earlier efforts. There is, I'm sure, considerable ambivalence about it.

What we need to consider is an action that will be more than symbolic, and I think I know exactly what it should be: a state-by-state petition and call for a new Constitutional Convention.

Article 5 of the U.S. Constitution stipulates that, on the application of 2/3 of the state legislatures (34 of the 50 states), a convention shall be called to propose amendments to the document. This method has never been used; the only constitutional convention in our history is the one that drafted the original Constitution itself, before Article 5 existed. But given the Supreme Court's rulings, plutocracy will prevail until the Constitution is amended to break the false equivalence between money and speech, and a corrupt Congress is most unlikely to propose such an amendment with the required 2/3 majority of both houses. The other method of amending the Constitution, a convention called for by the state legislatures, bypasses Congress altogether.

In formulating the petition, we should also make sure to specify how the delegates to the Constitutional convention are to be selected. It might suffice for the state legislatures to select delegates, although the ideal method would be direct election. Most certainly the delegates should not be selected by the U.S. Congress! And we should also insist, in specific terms, that a money-is-not-speech amendment be among the issues addressed by the convention.

A resolution of this nature, passed by 2/3 of the states, would be a much better demonstration of Occupy's influence than closing down the nation's ports.

Monday, December 5, 2011

The Premature Implosion of the GOP

This is an interesting election season that hasn't started yet. The important action of this election in its primary and caucus phase will of course be on the Republican side, as there is an incumbent Democratic president who has a lock on the nomination. So the fact that the Republicans are in the election news at this point is no surprise. What is rather surprising and, if I'm not mistaken, unprecedented is that the nomination is being largely decided before the first primary or caucus has been held.

Over the past few months, we have seen several Republican candidates rise in the polls only to sink again as either weird policy positions, poor debating performance, or skeletons emerging from their closets have lost them support among Republican voters. First it was Michele Bachmann. She was riding high for a while, but doubts about her ability to handle the economy and the entry of Rick Perry into the race dramatically cut her support, and at this point she has little chance to win the nomination.

After Bachmann came Perry. A meteoric rise in the polls was followed by an equally rapid fall as Perry revealed policy positions that were not in line with what Republican voters were looking for and displayed a truly clumsy and lamentable performance in the debates. Now Perry has sunk, too, but his fall has not resulted in a restoration of Bachmann as the political right's darling.

Instead, the focus moved to Herman Cain. Again, briefly, Cain was the man of the hour (the problem being that the hour occurred LONG before the nomination is to be officially decided). Cain has fallen, too (and unlike the others has actually suspended his campaign), partly because his gimmicky 9-9-9 tax plan looks awful to the degree people understand it, partly because of sexual indiscretions in his past.

Now, the white hat seems to be worn by Newt Gingrich, who has yet to implode and fall. But at this point, the first caucus of the campaign season is still almost a month away (Iowa is scheduled for January 3, 2012.) So there's still time for Gingrich, too, to self-destruct.

The Republican Party seems to be self-destructing, but what's particularly interesting to me is that this is happening so prematurely. The primary season hasn't even started yet, but the party is looking over and discarding candidate after candidate. It's as if the real decision occurs ahead of time through some other process, and the actual primaries and caucuses serve only as validations of that prior decision-making.

And that is in fact what I believe is happening. The decision-making is taking place largely over social media and Internet conversations among Republicans, in a form of Internet-based direct democracy. This is a variant of the same process that has given us the Arab Spring, the Occupy movement, and the amazing volatility of the stock markets over the past year.

What I believe we are seeing is a fundamental transformation in the way decisions are reached in our society. Institutions and values, culture and politics, all of these follow the lead of technological change. All adapt to material circumstances, and those circumstances are themselves altered by technology. When it comes to collective decision-making, communication technology is what's important. A simple and deceptively innocuous invention, the printing press, led first to a rebellion against the Catholic Church that split Christendom to the core, and later to a massive, widespread movement across Europe and America to replace monarchies with democratic republics. All this just from a technique for cheaply reproducing written material -- which made widespread literacy economically feasible and led to demands first for the right of believers to read Scripture for themselves, and then for participation by literate citizens in the political process.

The Internet is a similar widespread change, dramatically accelerating the ability of people to communicate and interact. That's especially true of social media, which permit two-way communication on a scale and at a speed unprecedented. This change increases the ability of people to reach collective decisions independent of official authorities and political processes -- and thereby increases the capacity for direct, as opposed to representative, democracy.

I've written a pamphlet on the subject of direct democracy on a national scale, which you can download free from here. But perhaps the larger transformation is what is happening not on this scale, but under the hood. The Republican Party's voters are making their decision about their 2012 candidate before the year 2012 even begins, and they are doing it online, in discussions about the candidates outside the official channels.

Wednesday, July 27, 2011

The Roman Republic and the Three Forms of Government

Although there are, in one sense, many different forms of government – parliamentary democracies, unitary republics, federal republics, dictatorships, constitutional monarchies, absolute monarchies, oligarchic republics – in actual practice it comes down to just three archetypal forms, as one might put it. All real-world governments can be described in terms of how much they lean towards one or another of these ideal archetypes. The forms are monarchy, aristocracy and democracy. The key dynamic of all politics is the desire of aristocrats (with or without titles) to maximize their own power. Both monarchy and democracy are opposed to their doing this, and so aristocrats strive to limit the power of both kings and people, and to gather all control of the government and of life itself into their hands. In American national patriotic mythos (America being founded by an aristocratic rebellion against a monarchy), kings are portrayed as the enemies of popular liberty; in reality, though, while it’s true that kings, monarchs, dictators, absolute rulers in general, can at times be despots, for the most part they are a danger to aristocrats, not to the common people, who have more to fear from aristocrats than from kings.

As it happens, history holds a wonderful real-world example of how these three forms mix and what lessons we can learn from them.

That real-world example is Rome, which in its long life prior to the collapse of the Roman Empire was first a monarchy, then a naked aristocracy (very briefly), then an aristocracy with democratic pretensions, then a monarchy again. Rome does not give us anything approaching real (as opposed to pretend) democracy, but for the purposes of this writing that’s actually a benefit. Let me explain what I mean.

Rome began life as an agricultural and trading city-state on an Italian river and on a trade route for salt. Like most city-states, it was ruled by a monarch, but in the case of Rome, for a long time the monarch was a foreigner, an Etruscan, which nation had somehow established political control over central Italy, including Rome. The details of this we don’t know, but the usual structure of government in the city-states of the ancient world consisted of a class of hereditary nobles, whose wealth came from land holdings and who traditionally served as the city-state’s elite warriors and war-leaders, with a king at the top of the governing structure. That was the case with Rome, as we can see from the institutions which survived the overthrow of the monarchy. The King of Rome was advised by the Senate, which consisted of the heads of the great aristocratic families, originally an even one hundred in number. (It was increased in size later on.) The Senate possessed considerable power and authority, as did each Patrician (the original Roman nobility) in his own lands and over his own people. But the authority of the King served to moderate and mitigate that of the aristocrats.

There are stories told about the heinous tyranny of the seventh and last King of Rome, Lucius Tarquinius Superbus, which may or may not contain elements of truth. What is undeniably true is that, whether they were goaded by the rule of a particularly offensive monarch or for some other reason, towards the end of the sixth century BCE the Roman aristocrats overthrew their King and created a new government, the Roman Republic. (The conflict of interest between monarchs and aristocrats is illustrated by the stories of King Tarquinius. It’s said that he murdered members of the Senate whom he suspected of supporting his predecessor, King Servius Tullius, who had usurped the rule after the death of Tarquinius’ father, King Tarquinius Priscus, and failed to replace the murdered Senators with new members. It’s also said that he ruled without properly consulting the Senate, as an absolute monarch or, in the strict sense of the word, a tyrant. And finally, there’s a story that the King’s son raped a noblewoman named Lucretia, who then committed suicide, driving her widowed husband and her family to overthrow the king out of personal outrage.)

The story of the Republic’s founding, complete with the alleged perfidies of King Tarquinius Superbus, is presented by Roman writers and those sympathetic to the Republic in later times as a triumph of liberty. The reality is not so simple. We must always ask ourselves two questions. Liberty for whom? Liberty to do what?

What was the governing structure of the Roman Republic actually like? As we shall see, it was a structure designed to maintain aristocratic rule. To that end, it contained both anti-democratic and anti-monarchical structures. Some of its structures served to concentrate power in the hands of the nobility and exclude it from anyone outside those ranks, or from the common people as a whole. Other structures served as checks and balances designed to prevent any one aristocrat – as opposed to the class of aristocrats – from gaining too much power. If the people became strong, if a genuine democracy was created, the aristocrats might lose their power and privileges in the face of popular resentment. On the other hand, if any one aristocrat became too strong, he might make himself a king, and restrain aristocratic rapacity from above. The Republic, in short, was not designed to protect the liberty of the common people. It was designed to maximize the privileges of the aristocracy, and to subjugate and plunder the common people, not to protect them. That was the reality, and any claim to the contrary was mere pretense.

The official or at least semi-official title of the Republic was “Senatus Populusque Romanus,” or “The Senate and the People of Rome.” The Senate continued to exist in more or less the same form and with more or less the same powers as it had possessed under the monarchy. The People were a new creation, or perhaps an evolution of something that had existed in embryonic form under the monarchy. Note the capitalization. The People of Rome were not the same thing as the people of Rome or the citizenry of Rome. It was a formal body, a name for four different legislative and judicial Assemblies that passed the laws and conducted many, although not all, trials. These Assemblies were, from oldest to youngest, the Curiate Assembly (Comitia Curiata), the Assembly of the Centuries (Comitia Centuriata), the Assembly of the People or of the Tribes (Comitia Tributa) and the Assembly of the Plebeians (Concilium Plebis). In addition to the Senate and the Assemblies, the Republic included various elected magistrates who presided over the day-to-day governing of the city.

The Curiate Assembly was really important only in the first twenty years of the Republic, before the Plebeian Revolt. In the beginning, this Assembly passed all the laws and elected the consuls (at this time the only magistrates, later on the most senior magistrates), and conducted trials. It consisted only of aristocrats, and so was a true, naked aristocratic body of rule. Here was the actual motivation behind the Roman revolution, blatantly – perhaps too blatantly – on display. Within two decades after the founding of the Republic, in the face of plebeian unrest, most of the powers of the Curiate Assembly were transferred to the other Assemblies, which (as we shall see) were in practice just as lopsidedly aristocratic, but could more successfully pretend otherwise.

The Assembly of the Centuries elected the highest magistrates, the consuls and praetors, after this power was transferred to it from the Curiate Assembly. It also conducted trials for high treason and could (but rarely did) pass legislation. In this Assembly, the voters were divided into economic classes, with the richest Roman citizens in the First Class, and those of lower means assigned to the Second through the Fifth Classes, while the poorest were denied any vote at all. Each Class was further divided into Centuries (hence the Assembly’s name), which for the First Class actually consisted of 100 men each. (Women could not vote. Few if any ancient societies that practiced voting granted the franchise to women, so that was not especially noteworthy on the part of the Romans.) The number of men per Century increased through the classes, so that a Century of the Fifth Class consisted of many more men than one of the First Class. When electing magistrates or voting on a law, each member voted within his Century, and the majority within the Century determined the vote of that Century. Because of the variation in Century size from one Class to the next, the very wealthy Romans (which of course meant the aristocrats) had a more influential vote than the poorer citizens. Voting in this Assembly proceeded from the Centuries of the First Class on down. Most of the time, a majority of the Centuries was achieved by the time the first two Classes had voted, so that in practice very few Romans had a vote. But in theory, any Roman citizen except the very poorest had at least some voice in the Assembly, so, unlike the Curiate Assembly, it possessed a veneer, a pretense of democracy.

The Comitia Tributa or Assembly of the People had a different structure. It was not organized by wealth. Instead, each citizen belonged to one of the thirty-five “tribes” of the Roman people. But the common people were almost all lumped into four huge “urban” tribes (in fact, most of the common people were in just two of them), while the aristocrats were distributed across the thirty-one “rural” tribes. In elections and legislation, each citizen voted within his tribe, and it was his tribe’s vote, as determined by the majority of its citizens, that actually counted. What this means is that the common people had just four votes, while the aristocrats had thirty-one. But here again, a veneer or pretense of democracy existed, because every Roman citizen could vote. In fact, even the poorest Romans had a vote. It just didn’t count for anything.
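If the arithmetic there seems too perverse to be believed, here is a little sketch of it in Python. The head-counts are invented for illustration (only the 31-to-4 split of the tribes comes from the record), and the same bloc-voting logic, with wealth in place of tribe, applies to the Assembly of the Centuries described above.

def assembly_vote(tribes):
    # Each tribe casts a single vote, decided by the majority of its
    # own members; the measure carries if most tribes vote yes.
    tribes_for = sum(1 for yes, no in tribes if yes > no)
    return tribes_for, len(tribes) - tribes_for

# Hypothetical numbers: 65 aristocrats in each of the 31 "rural" tribes,
# all voting yes; 25,000 commoners in each of the 4 "urban" tribes,
# all voting no.
rural = [(65, 0)] * 31
urban = [(0, 25_000)] * 4

print(assembly_vote(rural + urban))
# (31, 4) -- about 2,000 aristocrats outvote 100,000 commoners,
# even though every single citizen cast a ballot.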

The Concilium Plebis or Plebeian Assembly had a structure very much like the Assembly of the People, except that it excluded the Patricians from voting. The Patricians were the original Roman aristocrats. One could be a Patrician only by right of birth or adoption, and Patricians could not vote in the Plebeian Assembly. In the beginning, the Plebeian Assembly had no legislative powers; after the Plebeian Revolt, however, it was granted such powers – and at the same time, the nobility began to admit the very wealthiest of Plebeians into the ranks of the aristocrats. Such men were still not considered Patrician, but they could enter the Senate, could run for the magistracies, and had almost all of the same privileges as Patricians. And again, the rich, aristocratic Plebeians joined the Patricians in the thirty-one “rural” tribes, while everyone else still got dumped into the four “urban” tribes. Here we see the mechanism of control at its best and most deceptive. The Plebeian Assembly, in spite of these measures, could at times be a subversive, almost democratic body, but there were sufficient checks on its powers through cooption and rigged voting that most of the time it served to reinforce the powers of the aristocrats, even though many of those aristocrats could not vote in it.

These four Assemblies are what was meant by “the Roman People” in the official name of the Republic: not the collection of all Roman citizens in some democratic fashion, but rather the bodies that passed the laws, and were designed to maintain aristocratic rule.

The laws were all passed in one of the Assemblies or another. But the day-to-day running of the Republic was done by the magistrates and by the Senate. The Senate was originally restricted to Patrician membership, but at the same time as the rich Plebeians began to be coopted into the aristocracy, the door was opened for them to enter the Senate as well. Even so, the Senate remained a thoroughly aristocratic body, because only the richest of Romans could belong to it. Its decrees did not have the force of law, and could be overridden by a vote of an Assembly. But the Senate controlled the public purse, foreign policy, and military practice, and could direct the actions of the government on a regular basis as long as no Assembly actually did vote to override its authority. Here was the most aristocratic of the Republic’s governing institutions, with no pretense to democracy at all.

The magistrates were elected for one year at a time, and it is perhaps in them most of all that we can see the other side of the Republic’s protection of aristocratic power, because in them more than anything else resided the checks and balances that kept too much power out of the hands of any one man. Theoretically, any Roman citizen could run for any magistracy, but in practice most of the offices were reserved for aristocrats. Even the office of Tribune of the Plebeians was the prerogative of wealthy men and men with aristocratic names (although they could not be Patricians). As for the consuls, those at the head of the government, only the most august of families could in most cases stand any chance of running and winning.

But the power of any man who held high office was carefully circumscribed. The government was led not by one man but by two, both of whom held the title of consul. The term of office was only one year. By law, no man could run for consul a second time until ten years had passed since the last time he held the office. And the entire aristocracy would combine to oppose any man who rose too high and became too popular, too strong, too much a threat to dominate his peers.

So that was the Roman Republic: a veneer of democracy over what was in practice a completely aristocratic government. It was limited neither from below by democratic accountability to the people, nor from above by a king. From the standpoint of the common people, it was incredibly rapacious and led to the concentration of wealth in the hands of a powerful few, leaving most Romans with very little. (In fact, below the level of even the poorest citizen was an enormous number of slaves, the true lowest of the lower classes.)

Over time, as Rome expanded in power and the size of her empire, this structure of government ran into difficulties from both the lack of democracy at one end and the inefficiency involved in preventing too much power going to any one man at the other. One problem was that the conquests resulted in wealth flowing into the society, which concentrated into the hands of the aristocrats, further amplifying their power and the resentment of the common people. Revolts occurred from time to time, led by aristocrats who were either honestly motivated by a desire for social justice or willing to make use of popular resentment to advance their own power and fortune. The other problem was that emergencies arose that the Republic’s deliberately weak and inefficient central government could not easily handle. Whether these consisted of revolts in the provinces, military threats from abroad, or piracy on the seas, it became necessary to step outside the Republic’s constitutional restraints again and again in order to prevent the empire from disintegrating from within, being conquered by barbarians, or having its trade strangled. In the end, the two cracks in the system met and a very popular and liberal-minded leader of immense administrative and military ability – a Patrician, ironically enough, named Gaius Julius Caesar – established dictatorial rule, and after his assassination his chosen heir implemented a new monarchy.

Now this has been a long-winded discourse on the political structure and history of a nation that no longer exists, but it does have relevance for our own situation in America particularly – although in other countries as well – today. Although America has no titled, official nobility, it can’t be denied that we do have aristocrats. These aristocrats run the biggest corporations, dominate our government through bribery of elected officials, and impose a form of censorship through corporate ownership of the major media. And as always happens, too much power in the hands of aristocrats is bad news for everyone who isn’t one of them.

Like all aristocracies, ours is concerned with limiting the power of government, which is the only thing that can easily check its rapaciousness on an ongoing basis. As with all aristocracies, ours is concerned with preventing both democracy (the exercise of government power at the behest of the people) and monarchy (the rise of a popular, powerful individual who can swing the state against the interests of the rich).

This is the real motivation behind the political attempt to cut down the size of government. It is not to protect the liberty of the common people. It is to protect the privileges of the wealthy, and reduce the rest of us to servitude. Because for ordinary people, it is not the king, let alone any truly democratic government, that is the enemy of freedom. It is the aristocracy.

It is the rich.

Wednesday, May 4, 2011

On the Death of Osama bin Ladin

Every victory should be regarded with sadness, like a funeral. The only thing worse is a defeat. Having no need for victory is the real victory. Victory is merely avoidance of the appalling in favor of the barely tolerable. Thus it is with the recent slaying of Osama bin Ladin. It's better that it succeeded than if it had failed. Victory is better than defeat. But only just barely. It cannot make up for the great tragedy that either one, victory OR defeat, had to happen. We should not have had to do it. And some thought should be given to how to avoid having to do it in the future.

I use the word "tragedy" here in the classic Greek sense: misfortune arising from one's own flaws. Osama bin Ladin was our nemesis, the fruit of our hubris as a nation. He should never have existed, and if we had not betrayed our own values and our own identity as a nation he never would have. Or if he had, he would have been someone else's nemesis, not ours.

There are so many intertwined threads of truth to the death of Osama bin Ladin.

Start with the surface. He was a violent, evil man who was responsible for the slaughter of thousands. His death is no great loss to the world. On this, most everyone agrees. The only exceptions would be those who share his particularly twisted brand of violent Islamic ideology. (It's "Islamic" in the same sense as the Christian Identity movement is "Christian." I call it that because I don't know what else to call it. Normal Muslims may take exception, just as normal Christians may take exception to having racist neo-Nazi monstrosities lumped in with them. I shall here merely note the likely unhappiness, acknowledge that very few Muslims bear much resemblance in their beliefs to bin Ladin, and move on.)

But that's just the surface, taking bin Ladin's death as if it existed in isolation. It doesn't, of course. It's the most recent significant development in a saga that has included a lot of stupid, inept, opportunistic, and downright wrong-headed moves on the part of the United States. Starting from the World Trade Center attack on September 11, 2001, our first wrong move was President Bush's choice of words to describe what we were engaged in: "War on Terrorism." (Or "War on Terror." It depended on his mood on any given day, or perhaps on how much he'd had to drink recently.) We cannot, of course, wage war on a military tactic, nor on a loose-knit criminal organization. We can fight them, as we speak of fighting crime. But this fight cannot be a "war." War is inevitably fought by armies and navies in service to nations, one national government against another. So that was our first mistake. We took a criminal act and improperly dignified it by calling it an act of war, as if al-Qaeda were a nation and Osama bin Ladin its government. This error of terminology -- if it was an error and not a brilliant and wicked deception -- led us to the invasion of Afghanistan and later of Iraq, to the slaughter of tens of thousands of innocent people and the deaths and maiming of thousands of our own citizens. None of these actions did anything much against the organization that attacked us in 2001. But because we were "at war," we sought enemies that could be vaguely connected with al-Qaeda (and whose conquest could prove advantageous either in geopolitical terms or in service to corporate bottom lines) on which to spend the might of our vast military machine, so much of which was useless against al-Qaeda itself.

At the same time as we emphasized the wrong targets, due to the illusion cast by that word "war," we downplayed the right targets and the right tactics. At one point, after bin Ladin escaped at Tora Bora, President Bush actually stated that he considered capturing or killing the al-Qaeda leader a low priority. What we saw recently was the success of an approach that should have been emphasized from the beginning: an approach that avoids being mystified by (or seizing the opportunity presented by) that misleading term "war."

So much for the immediate layer below the surface. But now let's dig a little deeper still. Why did Osama bin Ladin choose the United States as his primary target?

There was a certain amount of calculation in his doing so. Osama bin Ladin's long-term goal was to create a new Caliphate, uniting all Muslims under a single rule, a return to the medieval greatness of Islam. (Preferably with himself as Caliph, one imagines.) The Muslim world is, of course, far from united. But one classic, time-honored way to unite squabbling peoples is to present them with a common enemy. By provoking the United States into taking ill-considered aggressive action in the Middle East, he hoped to enrage Muslims enough to have them set aside their differences in order to fight us. That didn't work as well as he'd hoped, but it explains why he launched the attack.

What it doesn't explain, however, is why he launched it at us. Why were we the right choice, the obvious choice, as the common foe of Islam? Why not attack some target in London, or in Tokyo, or in Brussels, or in Moscow? It doesn't take a whole lot of thought to arrive at the answer. America -- not Britain, Japan, the European Union, or the Russian Republic -- is the greatest of superpowers, the world's hegemon, the great power that must be defeated if Islam is to achieve greatness. America is the backer of Israel, the supporter of tyrants throughout the Muslim world, the new Rome.

And at root, that is where we went wrong, before bin Ladin was even born, and long before he launched his attacks in New York and Washington. That is why Osama bin Ladin exists, and why we had to kill him. Because we are not, in the national vision of our founders, supposed to be an empire, a superpower. We are supposed to be a land of liberty. We are supposed to be a democracy. And there is no such thing as a democratic empire. The two are incompatible, and one or the other must in the end be lost.

It's difficult for Americans nowadays to understand, because throughout my lifetime and for some years earlier we have had the world's most powerful military, so that it has come to seem normal. In reality, it is an anomaly of American history. Until the end of World War II, our nation always had a distrust of standing armies and a parsimony about military expenditure. We kept a small professional force, a cadre of officers, and when war loomed we would recruit or conscript an army around that tiny core and march off to face the enemy. During the major wars of our history -- the War of 1812, the Mexican War, the Civil War, the Spanish-American War, World War I -- we built powerful but temporary armies. When the war ended, the citizens who had rallied to the flag to meet the emergency laid down their arms and returned gratefully and happily to their civilian pursuits. The military budget shrank to nearly nothing, and so it remained during the years of peace, until the next war threatened. On the day the Japanese attacked Pearl Harbor, the United States had one of the weakest armies in the world.

As part and parcel of this, we went to war only rarely. Of the major wars the U.S. has engaged in, all but World War II were at the instigation of Americans themselves (the Civil War included, because the Confederates who started the war were Americans, too). Without a powerful standing army, we were seldom tempted to do this. Wars meant raising taxes, taking an economic hit, and of course sending young men off to die; they were not popular and we lacked the standing force to make the decision easier.

At the end of World War II, we had, once again, an enormous military force. It had been necessary to build this force in order to defeat the Axis, of course. But the common expectation was that, once again, as before, we would send all the boys home, and go back to our peaceful pursuits, retaining only that tiny cadre of trained military experts around which to build an army the next time war threatened. But for some reason things were done differently this time.

The Soviet Union presented a permanent enemy, a way to justify keeping a powerful military in times of peace. Why did we do this? It's a mystery to which there may be no one right answer. Maybe people in government genuinely believed in the Communist threat. Maybe it was the arms industry and others who profited off this massive government largesse. Maybe it was something hidden in the halls of power in Washington, desirous of empire and national power. Maybe it was a combination of all three. Whatever the motives, though, the actions in service to them are plain enough. We retained a huge military force. We built a chain of military bases all over the world. We supported puppet governments either to have allies in the Cold War or for economic reasons. We found ourselves continuously at war somewhere in the world. We were never, or almost never, wholly at peace.

We built a national-security apparatus, a government within a government, operating in secrecy, unaccountable to the voters, barely controlled by the President and not at all by Congress -- a clear violation of all the principles on which America is supposedly based. This is not new in the world, although it was new for us, and wrong for us. It's the way every empire in history has always operated. It's the way empires have to operate. Empire and democracy are incompatible. We cannot have both. That means that empire and America are incompatible. We cannot have both. We have become something other than America, something that our ancestors would look upon in horror.

In 1991, we were presented with a golden opportunity to set all this aside, bring the empire to an end, and become once more America. The Soviet Union, our opponent in the Cold War and the justification for empire from 1945 until then, ceased to exist. We could have shut down the bases, dismantled most of our armed forces, declared victory and gone home. We didn't. And that surely proves that by that time the empire was pursued for its own sake and the Cold War had become merely an excuse -- if it had ever been otherwise.

Today, we have a military force that costs nearly as much as those of the entire rest of the world combined. We have hundreds of military bases in every corner of the world. We have the ability to invade any country on earth that we choose to invade, and we have arrogated to ourselves the willingness to use that ability whenever we choose, on whatever pretext we like, or on none. We have a government, unaccountable to its people, that claims the authority to detain without trial, without rights, anyone -- citizen or foreigner -- that it labels as an "enemy combatant."

That is not America. It is the American Empire. And it was the American Empire, not America the land of liberty, that Osama bin Ladin attacked on 9/11/01. He was our nemesis, attacking in response to our hubris. The entire affair of the last ten years has been our tragedy.

Now he is dead. But the tragedy goes on, and will until the American Empire, too, is laid to rest.

Monday, September 27, 2010

Quote from a rich guy: "Tax me more."

Before presenting what is to follow, I have to apologize for neglecting this journal. My bad. I do have an excuse, mainly that I've been devoting my writing energy to fiction. I've finished the second novel in the Star Mages series and am getting it together pre-publication at this point. So that's good, but my feeling is that although it might seem like a decent excuse, I made a commitment here, I failed to keep it, and there is no excuse apart from physical or mental incapacity, neither of which applies. (Yet. Knock wood.) :)

So I'll try to make up for that failure. To start with, I ran across an editorial in the LA Times by venture capitalist Garrett Gruener, who said some important things in it that people need to understand and, thanks to trickle-down propaganda, often don't. Here's the link to his article:

tax me more

Some excerpts that are particularly important:

"I'm a venture capitalist and an entrepreneur. Over the past three decades, I've made both good and bad investments. I've created successful companies and ones that didn't do so well. Overall, I'm proud that my investments have created jobs and led to some interesting innovations. And I've done well financially; I'm one of the fortunate few who are in the top echelon of American earners.

"For nearly the last decade, I've paid income taxes at the lowest rates of my professional career. Before that, I paid at higher rates. And if you want the simple, honest truth, from my perspective as an entrepreneur, the fluctuation didn't affect what I did with my money. None of my investments has ever been motivated by the rate at which I would have to pay personal income tax. . . .

"When inequality gets too far out of balance, as it did over the course of the last decade, the wealthy end up saving too much while members of the middle class can't afford to spend much unless they borrow excessively. Eventually, the economy stalls for lack of demand, and we see the kind of deflationary spiral we find ourselves in now. I believe it is no coincidence that the two highest peaks in American income inequality came in 1929 and 2008, and that the following years were marked by low economic activity and significant unemployment.

"What American businesspeople know, and have known since Henry Ford insisted that his employees be able to afford to buy the cars they made, is that a thriving economy doesn't just need investors; it needs people who can buy the goods and services businesses create. For the overall economy to do well, everyday Americans have to do well. . . .

"Remember, paying slightly more in personal income taxes won't change my investment choices at all, and I don't think a higher tax rate will change the investment decisions of most other high earners.

"What will change my investment decisions is if I see an economy doing better, one in which there is demand for the goods and services my investments produce. I am far more likely to invest if I see a country laying the foundation for future growth. In order to get there, we first need to let the Bush-era tax cuts for the upper 2% lapse. It is time to tax me more."

It's not surprising that a venture capitalist "gets it" about what limits investment in job-creating ventures: not availability of capital (i.e., not how much money rich investors have lying around), but expected return. What's more, the main thing that drives expected return is not how much the investor can expect to keep after taxes, but rather how much demand exists for the goods and services the investment is supposed to produce. As I said, it's not surprising a venture capitalist gets this; if he didn't know why he invests in one area rather than another, say in business rather than in financial instruments, he would not likely be successful at what he does. We ought to listen to him when he says things like this. I mean, when someone says, "Raise MY taxes," we can be pretty certain he's not speaking out of duplicitous self-interest. Unless he's a masochist or something.

But I'm going to take this argument one step further. Mr. Gruener says that small changes in his tax rate have no effect on his investment decisions. But what about big ones? What about the effect of the original Reagan tax cuts that dropped top marginal rates from the 60-70 percent range down into the 30s? On the other hand, what would be the effect of creating new tax brackets with very high taxes applied to very high incomes? What about a 95% tax on personal income over a million dollars a year?

Before continuing with this, maybe an explanation is in order about how "marginal" tax rates work. Right now, the top tax rate is 35% on incomes above $373,650. Does that mean that if someone makes $400k a year, he'll pay 35% of his income in federal income tax? No, it's a bit more complex than that. He'll pay that 35% only on taxable income above $373,650, which is to say, on $400,000 - $373,650, or $26,350. (That's if the $400k represents taxable income, not total income, of course.) He pays at a lower rate on all the rest of his income. On the first part of what he earns for the year, he pays no taxes, just like everyone else.
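For readers who think better in code, here's a toy bracket calculator along the same lines. Only the top figure -- 35% above $373,650 -- is the real 2010 number; the lower brackets are simplified placeholders, not the actual IRS schedule.

def tax_owed(taxable_income, brackets):
    # brackets is a list of (threshold, rate) pairs, lowest first.
    # Each rate applies only to the slice of income between its own
    # threshold and the next one -- never to the whole income.
    owed = 0.0
    for i, (threshold, rate) in enumerate(brackets):
        if taxable_income <= threshold:
            break
        ceiling = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        owed += (min(taxable_income, ceiling) - threshold) * rate
    return owed

# Placeholder lower brackets; the real top bracket of 35% above $373,650.
brackets = [(0, 0.00), (10_000, 0.15), (100_000, 0.28), (373_650, 0.35)]

print(tax_owed(400_000, brackets))   # total tax on $400k of taxable income
print((400_000 - 373_650) * 0.35)    # the 35% hits only $26,350 -- about $9,222.50

Run it on $400,000 and you'll see the top rate touches only the last $26,350, exactly as described.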

So a 95% tax on income above a million dollars doesn't work to impoverish millionaires. What it does is to impose a personal income ceiling. Nobody is going to bother making over a million dollars in taxable income when Uncle Sweetie is going to make off with almost all of it. It won't hurt you to make more than a million (remember, all the income below a million is still taxed at the lower rates), but it won't help you much, either. So investors will stop investing in anything that would push their income above that point, and that will hurt the economy, right?

Well, not so fast. To begin with, most investments, and all of the ones we really want, are tax-deductible and so don't count as taxable income. If you start a business, most of the start-up costs are not taxed. (There are some exceptions involving heavy-equipment purchases, where the tax deduction is split over a number of years.) Wages you pay to employees are never taxed as your income. (As the employee's income, yes.) So what a confiscatory tax on really high income actually does is to give the person making that kind of scratch a really strong incentive to find places to invest that money where it will eventually pay off, but won't be taxable in the meantime. So -- provided we choose which investments to encourage through tax write-offs wisely -- this could actually spur investment rather than discouraging it.
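To put rough numbers on that incentive, here's a sketch that lumps everything below the million-dollar line at the current 35% top rate, assumes the hypothetical 95% rate above it, and considers a taxpayer with $2 million of income deciding whether to plow half of it into a deductible business investment. The figures are illustrative assumptions, not law.

ORDINARY_RATE = 0.35       # stand-in for everything below the line
CONFISCATORY_RATE = 0.95   # the hypothetical rate above $1 million
CEILING = 1_000_000

def kept(taxable_income):
    # After-tax income, ignoring the real lower brackets for simplicity.
    over = max(taxable_income - CEILING, 0)
    under = taxable_income - over
    return under * (1 - ORDINARY_RATE) + over * (1 - CONFISCATORY_RATE)

take_it_as_income = kept(2_000_000)    # keep $700,000 of the $2 million
invest_a_million = kept(1_000_000)     # keep $650,000 after deducting $1 million

# Deducting a full $1,000,000 into the business reduces what the owner
# keeps this year by only $50,000 -- the investment is nearly free to make.
print(round(take_it_as_income), round(invest_a_million),
      round(take_it_as_income - invest_a_million))

In other words, the confiscatory band turns "spend it on the business" into the obvious choice.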

Another consideration besides tax deduction is how quickly an investment pays off. The thing about investing in real business (that is, making stuff or providing services) is that it's a long-term project. You don't expect a quick payoff in the first year. Ask anyone who's ever started a business. You expect to lose money the first year, maybe the second year, maybe even longer depending on exactly what business you're in. Down the road, though, you do expect things to pick up to the point where you've recouped all those losses and made a profit. (It doesn't always happen that way, but you do expect it or you wouldn't have made the investment to begin with.) There are other kinds of investments, though, that can pay off very quickly. A good example is short-term trading on the stock market, where you're not trying to acquire stock for the long haul but rather to buy low and sell high, conceivably in a single day. Even better examples are the kinds of financial trading that resulted recently in the near-collapse of our financial system. To be sure, those particular investments went bad, but the point is that when they pay off they pay off quickly. That makes them preferable to investments in real business if you want a quick gain that can be reinvested for a multiplier effect.

What a confiscatory tax rate on very high income would discourage is this sort of investment. Why seek a quick payoff -- that is, a payoff this year -- if 95% of it goes to the federal government? Under that regimen, it makes a lot more sense to defer financial gains.

To illustrate, consider this. Let's say someone is making half a million normally. The person has another half million to invest. For the sake of simplicity, he has two choices, either of which will return that half million and another million dollars on top of it. He can invest it in short-term financial manipulation that will give him the whole million and a half by year's end. Or he can invest it instead in a start-up company making widgets, take a loss the first year, and recoup his investment plus another million over the next ten years.

If he chooses the latter route, he gets back an average of $150k a year, and in no year does the net return exceed $300k (let's say). Now: if his investment return is taxed at 35% no matter which way he goes, then he's better off investing in the short-term instrument. His net profit after taxes is $650k either way, and if he does the quick-return bit, he'll have all of it to reinvest next year for more return still. But with the confiscatory tax in place, he'll be much better off taking the slow road. The short-term route dumps the entire million in profit into a single year, pushing his total income to a million and a half; the half of the profit that lands above the million-dollar line is taxed at 95%, so his net return shrinks to about $350,000. The long-term investment spreads the same million over a decade, none of it ever crosses the line, all of it is taxed at the ordinary rate, and he nets the full $650,000.
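And here's the whole comparison worked out in code, under the same assumptions: $500k of other income, the million in profit taken either in one year or in ten equal slices, 35% below the line, and the hypothetical 95% above it.

ORDINARY_RATE = 0.35
CONFISCATORY_RATE = 0.95
CEILING = 1_000_000
SALARY = 500_000

def profit_kept(yearly_profits, confiscatory):
    # After-tax profit across the years, treating SALARY as already
    # taxed and ignoring the real lower brackets for simplicity.
    kept = 0.0
    for profit in yearly_profits:
        over = max(SALARY + profit - CEILING, 0) if confiscatory else 0
        under = profit - over
        kept += under * (1 - ORDINARY_RATE) + over * (1 - CONFISCATORY_RATE)
    return kept

short_term = [1_000_000]       # the whole profit lands in a single year
long_term = [100_000] * 10     # the same profit spread over a decade

for name, plan in [("short-term", short_term), ("long-term", long_term)]:
    print(name,
          round(profit_kept(plan, False)),   # flat 35%: $650,000 either way
          round(profit_kept(plan, True)))    # with ceiling: $350,000 vs $650,000

Under the flat rate the quick flip wins (same money, sooner); with the ceiling in place, patient investment in the widget factory nets almost twice as much.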

Bear in mind he's going to invest the money anyway. The only question is in what. Since investment in making things and providing services is what we want (that's what creates jobs), we want to encourage that and discourage the kind of investment that just plays with money.

Saturday, July 3, 2010

Prosperity: What's An Economy For?

I'm going to be writing a series on what I have come to call "Money-Free Economics." By this I don't mean economics of a barter system or of an economy without money; rather, I mean economics that ignores money and goes to the underlying real-wealth economy that money facilitates. I acknowledge up front that this creates a certain amount of distortion. There are features and processes of a modern economy that can't be understood without addressing money, among them interest rates, the effects of government fiscal policies, and speculative investment -- to name but three of many. But money also creates distortions. In particular, schools of economics that address money without touching on the underlying economy of goods and services often create severe distortions by treating money as if it existed and operated independently of the goods and services for which it is a token of exchange -- as if only money, not stuff, mattered. Moreover, those features of an economy that require addressing money to understand are already covered well by professional economists in their various schools. On these matters they don't require any help from me (often it's the other way around). But when economists present something as stupid as, for example, the laissez-faire interpretations of the Laffer Curve, or explanations for recession that rely entirely on monetary factors and ignore the distribution of wealth, I know that they have focused on money to the point where they have forgotten that it is just a token of exchange and not real wealth, because when you put such claims in money-free terms their nonsensical nature becomes obvious. So, to address the follies of economists and the politicians who quote them, I shall engage in an exercise, presenting economic concepts in ways that don't use money at all.

I'll begin today with an examination of what an economy is and what it's for in money-free terms.

An economy is, to begin with, a social arrangement. It involves the assignment of ownership, division of labor, and rules of exchange and trade. In a modern society it is always a function of law. That wasn't always so, because human beings have not always lived under the rule of law, but even in pre-civilized times when there was no law as such and no formal government, there were still rules about who owned what, who was supposed to do what, and who got what in the end.

What this social arrangement is meant to do is to regulate and facilitate the production and distribution of wealth. Wealth, as I pointed out in the last entry, consists of goods and services. Going into a bit more detail, wealth consists of eight things: food, clothing, shelter, tools, toys, entertainment, advice, and assistance. Everything you or anyone else ever buys or sells falls into one or more of those categories. The economy is a social arrangement whereby these eight things are produced and gotten to the people who want and can use them. Those are the two criteria of economic success. As long as those eight things can be produced in sufficient quantity and quality and distributed to everyone who needs and wants them, the economy is a success. When either of these functions fails, the economy fails. If not enough food can be grown, or if the food that is grown can't be gotten to the people who need to eat it, there is famine. If not enough housing can be built, or if housing is built but sits vacant while people are homeless, there is a housing crisis. And so on.

Every failure of the economy, every depression, every recession, every instance of runaway inflation, every bubble collapse, even the economic failure that occurs after a military defeat, manifests ultimately in a failure either of production or of distribution or both. Even when the cause (or at least the trigger) of the economic problems is fiscal or monetary, such as a stock-market crash or the collapse of a housing or real estate or some other bubble, it always comes down in the end to a failure to produce or a failure to distribute. If it does not, then it is a nonexistent problem as far as the overall economy is concerned.

Problems can occur on either the production or the distribution side. An example of a production-side problem is a severe drought that results in crop failure, creating a food shortage and starvation. Another example is the devastation created by war, as in Germany during and after World War II, when Allied bombing and the Allied and Soviet invasions destroyed German factories and industrial capacity, along with German roads and railroads. A third example, more subtle, is the impact on the U.S. economy of the 1973-74 OPEC oil embargo and the oil shocks that followed through the decade, which caused shortages of a crucial raw material. An economy that is in a pre-industrial state and trying to industrialize also faces production challenges, not in the sense of losing production but in the sense of wanting to increase it. In general, production of wealth requires raw materials, labor, knowledge, and organization, and a shortage of any of these (for whatever reason) results in a deficit of production.
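
That last point can be pictured as a limiting-factor calculation, in the spirit of what economists call a Leontief production function: output is capped by the scarcest input. A quick sketch, with invented figures:

    # Production capped by the scarcest input; all figures invented.

    def max_output(raw_materials, labor, knowledge, organization):
        """Assume each unit of output requires one unit of each input,
        so the scarcest input sets the ceiling on production."""
        return min(raw_materials, labor, knowledge, organization)

    # Plenty of labor, knowledge, and organization, but an embargo
    # has cut the supply of raw materials:
    print(max_output(raw_materials=40, labor=100,
                     knowledge=100, organization=100))  # -> 40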

Problems of production are severe, but problems of distribution can be equally severe. The Irish potato famine was, at root, a distribution problem. It had a proximate cause on the production side, a potato blight that caused crop failures, but this would not have resulted in famine except that the Irish wheat lands were all in the control of aristocratic landholders who were entitled to the wheat crops for export. That's why ordinary Irish people were dependent on a potato diet in the first place. A more nearly equal distribution of Ireland's food crops would have meant that when the potato harvest failed, the people could eat other foods. Severe maldistribution of the nation's agricultural wealth meant that the potato blight became the potato famine.

The Great Depression and similar breakdowns in the years before it (for example the Long Depression, which began in 1873 and lasted longer than the Great Depression itself, although it was not quite as severe) were also breakdowns of distribution. The economies of the advanced nations, such as the United States, suffered no shortages of raw materials, labor, knowledge, or organization, and there were initially no problems of production. But the goods produced were not distributed to the people who would use them. Because of the system of private ownership of capital property, the goods produced in a factory (say) belonged to the factory's owner, and anyone who wanted those goods had to exchange items of value for them (by way of money, of course). Since not enough of the people who wanted the goods had anything of sufficient value to exchange for them, the goods could not be sold, and so they sat in warehouses, of use to no one.

The distorting effect of money can easily be seen in this sequence of events, which was set in motion by the desire of capital property owners to keep for themselves as much of the wealth produced as they could. As long as we think in terms of money, this is perfectly understandable: the rich wanted to become richer. But if we think in money-free terms, the silliness of it becomes clearer. How much in the way of food, clothing, shelter, tools, toys, entertainment, advice, and assistance does even the richest person need? How much of these things does he even want? How much can he use? After a certain point, all that stuff is wanted not for use but for sale, and if a relatively few rich people own almost everything of value, for what can it be sold?

Here is the fundamental flaw of capitalism. It is predicated on, and focused on, the accumulation of individual fortunes, which means that it ultimately undercuts its own basis, resulting in economic breakdowns due to maldistribution of wealth and consequently depressed demand. Economists have gone to great lengths to avoid acknowledging this. There is, or used to be, a concept in economics called "overproduction" or "surplus production," meaning that the economy was producing more stuff than people could use, so that in order to maintain full employment and productivity the surplus needed to be sold abroad. But the economy has never actually produced more stuff than people could use (although that's theoretically possible). It has only produced more stuff than the people who wanted to use it could buy. That's a very different thing. The demand for goods and services depends not only on people's desire for things but also on what they have to trade for them, and for most people the latter is exhausted long before the former. (Those for whom it is not exhaust their desire to buy instead. Either way, stuff remains unsold.)
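
The arithmetic of that distinction is simple enough to sketch. In the toy calculation below (figures invented), what actually gets sold is capped not just by what is produced and desired but by what would-be buyers hold to trade:

    # Effective demand is capped by what buyers can offer in trade,
    # not by what they want. All figures are invented.

    def units_sold(produced, desired, tradeable_wealth):
        """Sales are limited by production, by desire, and by the
        wealth buyers hold to exchange (one unit per unit bought)."""
        return min(produced, desired, tradeable_wealth)

    produced, desired, buyers_hold = 100, 100, 60
    sold = units_sold(produced, desired, buyers_hold)
    print(sold, "sold;", produced - sold, "sit unsold in the warehouse")
    # -> 60 sold; 40 sit unsold in the warehouse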

One of the most striking things about economics today, even more than its disconnect from the economy of stuff and its focus on the arcane economy of money, is the refusal of many of its practitioners to think about the elephant in the room: the distribution of wealth. Even when an economist (by this stage of the game, usually one long dead) takes a money-free approach, the analysis often suffers from this flaw. A good example is Say's Law.

Say's Law is an economic principle attributed (somewhat inaccurately, but that's by the way) to the French economist Jean-Baptiste Say, who lived and worked in the late 18th and early 19th centuries. Say argued that there can never be a general glut of goods -- too much on the market to be sold -- because the production of goods itself creates the value with which other goods are bought; goods are exchanged only for goods, even when they are exchanged by way of money. As far as it goes, that's true. But it matters very much whether the goods produced are owned, and so exchangeable, by those who desire the other goods produced. In other words, it matters how widely wealth is shared. The fact that enough wealth exists, in the form of other products, to exchange for everything produced does no practical good unless that wealth is in the possession of those who wish to buy.
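
Here's a toy illustration of the point, with two invented households and no money anywhere: in the aggregate there is exactly enough product to exchange for all product, just as Say said, and yet exchange still collapses when ownership is concentrated:

    # Say's Law in miniature: goods exchange for goods, so in the
    # aggregate there is always enough value to buy everything made.
    # But exchange requires that value be in the right hands.
    # Two invented households; all numbers are for illustration.

    # Case 1: wealth evenly held. A has 10 units of food to offer,
    # B has 10 units of cloth; each wants what the other has.
    a_offers, b_offers = 10, 10
    print("even:", min(a_offers, b_offers), "units exchanged")  # -> 10

    # Case 2: the same 20 units of total output, but A now owns 19
    # of them. B still wants A's goods but holds only 1 to trade.
    a_offers, b_offers = 19, 1
    traded = min(a_offers, b_offers)
    print("concentrated:", traded, "unit exchanged;",
          a_offers - traded, "units sit unsold")  # -> 18 unsold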

One finds many critiques of Say's Law among economists, but rarely will one find this fundamental flaw recognized. John Maynard Keynes, for example, identified three assumptions underlying Say's Law: a barter model of money (goods are exchanged for goods), flexible prices (which can rapidly adjust upward or downward with little or no "stickiness"), and no government intervention. Keynes himself disputed the second assumption, arguing that prices are not necessarily flexible. Others have disputed the first or the third. (And here one does run into the distorting effect of money-free economics, because there are aspects of a money economy that do not perfectly mirror a barter economy. However, that is not the real problem with Say's Law.) It's true that the idea rests on at least the first two of those assumptions, but it also rests on another, which is self-evidently false: an equal or near-equal distribution of wealth.

It's a curious thing, this refusal of even a supposedly "progressive" economist such as Keynes to address the central problem of inequality, when his own work naturally lends itself to doing so. Those who do address it usually confine themselves to its moral aspects without considering its economic aspects. But the economic aspects are also real and also important.

Returning to the two functions of an economy, the production and distribution of wealth, we may take as a template the economy of a pre-civilized community, in which a small band of human beings owned all capital property in common and shared tasks and wealth more or less equally. Production-side problems arose often enough in the form of shortages, but distribution-side problems did not. Even when production problems occurred, they were never due to failures of organization, only to shortages of raw materials, knowledge, or labor. The economy functioned in the manner Marx described as "communism," the end-state of his theoretical economic progression: from each according to his ability, to each according to his needs. Now, my personal opinion is that Marx had to have been smoking something to believe that an advanced economy, whose essence is impersonality, could ever operate communistically in this fashion. But we may nonetheless take that ancient pattern as the ideal, in terms of distribution and of the organization of labor and raw materials, and evaluate our modern substitutes by how closely they approximate it. The truth, of course, is that they fall far short -- but in fairness, they have a much more complicated problem to solve.

In future posts, I'll consider historical economies that worked better than the one we have now, along with some spectacular historical failures. Finally, I'll speculate about alternatives to capitalism as it currently exists. In all cases, I'll approach the questions through money-free economics, in order to keep things as simple and non-arcane as possible.