Thursday, January 31, 2013

Fundamentalism = Fascism?

Just finished Chris Hedges' American Fascists (2007), which I commend to anyone who wants to read about the hucksters, pettifoggers, shysters, and spiritual costermongers who populate the fundamentalist American Evangelical scene. Per the book's title, Hedges' purpose is to draw attention to the parallels between the Jerry Falwells, Pat Robertsons, and Tim LaHayes of recent American decades and the Adolf Hitlers, Hermann Görings, and Adolf Eichmanns of WWII Germany. There are parallels, yes. Strategies for the use of mass media, obsession with the sexual purification of the community, denial of the right of dissenters to participate in the deliberative political process, valorization of strongmen, etc.--just to name a few.

The book might be read as a prequel to his attack on atheist fundamentalism, When Atheism Becomes Religion, which was published the following year. Though it is probably better understood as part of a larger project of exposing the outrages and hypocrisies of fundamentalist outlooks as such. As much as his atheist and religious targets of criticism despise each other, each accords the other the right to exist as a scapegoat, to be driven from the camp, as it were, in order to purge the community. Hedges is not willing to grant even that much. If someone wants to pretend to have plugged themselves directly into the truth of all things, and to suppose that they possess a right to dictate terms to everyone else, they don't deserve the time of day.

What's more, Hedges can claim the privileged perspective of an insider--presumably an insider to both perspectives, but I will focus on his arguments against the religious right. Born and raised in upstate New York, the son of a pastor, Hedges went on to study at seminary before pursuing a career as an investigative journalist, and he details a disgusting process of divine domestication pursued by the religious right since the 1970s. This was and is a world of which I am only vaguely aware. A sheltered Canadian existence is relatively insulating. Something, I think, worth thanking God for. (Can I get an amen from my Canadian brothers and sisters? Still can't hear you! I said: CAN I GET AN AMEN?!)

Frank and brutal, the book describes how evangelical leaders have played on people's fears, demanded their obedience as an expression of faith in God, and fleeced their flocks for every last penny. Indeed, American Fascists is perhaps too uniform in its portrayal of Evangelicalism not to raise questions. Early in the book there is a hesitant suggestion that a younger Billy Graham belonged to a different breed and generation of televangelists than those listed above. For the rest, it is difficult to find any hints of redeeming qualities in the movement.

Which raises the question of to whom the book is addressed. It's difficult to imagine it was written to inform the average Evangelical believer about the depravities of their beliefs. Hedges' target is continually moving. Here a scandal-ridden televangelist, there the legal hijinks of a church denomination, and here again the Evangelical movement as a whole. But Mr. or Mrs. Smith, standing on the corner of 1st and Main, in Anyville, Anystate, collecting donations for the food and clothing drive next Saturday? In some sense or other, they always seem to find their way into Hedges' sights: by identifying with the ministry or message of someone like James Dobson, by belonging to that denomination or mega-church, or merely by being Evangelical; but they themselves don't necessarily figure into his picture. One gets the sense he preaches to the converted: to those who have already escaped the clutches of the same narrow-minded Evangelicalism he exposes. To his credit, he appears to recognize as much.

My difficulty with the book stems from this rather superficial analysis of Evangelicalism. Moving so quickly from outrage to hypocrisy, from financial to sexual indiscretions, Hedges never gets around to saying more than that clergy ought not to behave like this.

Elsewhere in his writings Hedges has rather harsh things to say about finance capitalism (presumably he would not object to market capitalism). He ought to have pointed out how the forms of authority in Evangelical churches more and more imitate the forms and trappings of authority in the corporate world--which, unlike a hierarchical Catholic church order, invests in its leadership no real understanding of moral obligation towards others, and unlike a Presbyterian church order, invests no real sense of personal responsibility towards others. The evidence is right in front of his nose; he documents it in great detail. Pastors behaving like CEOs. The Gospel message pitched like a car sale. Outreach success measured in terms of profit margins.

Nor does he get around to asking why it is we allow business leaders to behave in ways that we despise in our clergy.

Wednesday, January 30, 2013

American Politics, Set Against an English Backdrop

Watching American news networks, listening to Americans comment on the political gridlock in Washington or the cultural divisions that threaten to tear a nation apart, you would think things were better in the past. Why can't the Democrats and Republicans get along? Why can't we all just behave like adults? This hysteria-inducing short-sightedness, as seen from the 'privileged' Canadian perch I occupy, prompts no small amount of eye-rolling, or, alternatively, head-shaking.

Most Americans, no doubt, will see the contemporary impasse as precisely that: a contemporary impasse. Very few will dig around and question possible historical antecedents.

In very general terms, for example, blue states and red states divide geographically between, on the one side, the Atlantic and Pacific coastal regions and the north-east, and on the other, the mid-western and south-eastern quadrants of the continental United States. Blue states tend to be more urban and industrial, while red states tend to be more rural and agricultural. These are broad tendencies, and don't match every particular county or city. But the evidence is there for all to see on electoral maps.

Now let's go looking for antecedents. Surprise, surprise; there is very little new about the present situation.

Take the identifiers of urban/industrial and rural/agricultural. In very general terms, the same division can be found between the largely Democratic and Confederate Southern States and the Republican Northern States during the American Civil War. (Recall here that voting constituencies flip-flopped early in the 20th century.) And once more, the same division can be found in the American Revolution between a more urban and commercial New England, where the Revolution originated, and a more rural collection of Southern States whose economy was largely built on plantation exports.

I suspect most American historians would be inclined to stop here. There is, however, one more antecedent that deserves mentioning, because it illuminates English involvement in both the Revolution and the Civil War. This would be the English Civil Wars, three in total, extending from 1642 to 1651, which are prefaces to Oliver Cromwell's Commonwealth and Protectorate (1649-59) and later also the Glorious Revolution of 1688. The same urban/commercial and rural/agricultural interests are at odds between parliament and landed nobility through the 17th century, and the conflict between these parties acts as a restraining influence on British involvement in the Americas during both the Revolution and the Civil War.

What's especially interesting, if you follow this Anglo-American narrative through to its conclusion, is that two different models of government are at war with each other--figuratively speaking: a participatory parliamentarian model favoured by urban interests and a paternal model favoured by landed and rural nobility. Today these models would seem to correspond, at least on paper and in public rhetoric, to the social program-loving Democrats and the corporate money-loving Republicans.

What's curious, particularly for me as a student of religious history, is that these two forms of government mimic models of church government: an elected delegation in a participatory Presbyterian (Calvinist) model versus a hierarchical order in a paternalistic Roman Catholic model. It seems Henry VIII's Act of Supremacy, which made him head of the English Church, may still have relevance today, as it provides a clear signal setting the cultural dialectic into historical motion.

The perplexing question that arises has to do with why, when Presbyterians consistently represented the urban/commercial or urban/industrial interests down through the 19th century, their successors, particularly among white Evangelical Protestants, seem to have aligned themselves with their old rivals.

Otherwise put: Why has 'religious' politics become more paternal, more about putting 'godly leaders' into office in Washington, more about the moral struggle for the cultural 'soul' of America, and much less concerned with participatory motifs like equal opportunity for all guaranteed by the rule of law?

Tuesday, January 29, 2013

Getting Ahead of Ourselves

The so-called North American culture wars between liberals and conservatives divide very neatly in terms of how our common human future is conceived. On the one side, stories are told of the progressive enlightenment of that portion of the human race that views the world through a rational lens. On the other, stories are told of moral decline and destruction, a consequence of the usurpation of centers of power by liberal forces.

Both sides are ahead of themselves. The former forgets that long-term progress is in absolutely no sense predictable, while the identification of markers of short-term progress is rarely more than a form of self-gratification. The latter misunderstands morality as something imposed by an external authority, who or which demands submission, when it is actually something imposed on oneself.

In the Hegelian sense, one thinks in a circle, whose true beginning is only revealed at the end. In the Heideggerian sense, one thinks about the past and present out of a possible future. Which is not to say either side is explicitly Hegelian or Heideggerian.

Rather, both sides have a share in the peculiarly modern tendency, which includes the post-modern detour some especially critical thinkers seem to have taken, to judge the human past and present from conjured images of a time that might be.


Neither side, it seems to me, really takes seriously present human life, if by present we mean the limits of bodily existence, the secularity* of the political order. Were they to take it seriously, they might actually talk to each other, and seek agreement on the basis of mutually shared interests, rather than stoke disagreement on the basis of a need for a principled stance.

*By 'secularity', I mean that understanding of living in a middle age, between past and future, cut off from all abstract origins and endings--a thoroughgoing medievalism, if you will, in which origin and end coincide in a/the mortal human body.

Monday, January 28, 2013

Today in Timbuktu (updated)

French and Malian troops are in the process of securing the city of Timbuktu against fleeing Islamist occupiers. On the minds of many will be ensuring the basic necessities of life, restoring local infrastructure, and kick-starting civil society. Human rights violations, regional stability, access to resources, and other such considerations play into the calculus of foreign intervention.

Timbuktu is also home to a treasure-trove of pre-colonial African history. From the 15th to the 19th century, the city stood at the crossroads of North African politics, trade, and learning. The Spanish Muslim scholar Leo Africanus visited the city in 1510, and has provided us with one of the earliest eye-witness descriptions of the city. At the end of the 18th century, the mystique of the city, spurred by Leo's account, drew several European attempts to reach the place.

Today a relatively small city, Timbuktu is believed to be home to a collection of some 300,000 documents standing witness to that pre-colonial history. The best inventory suggests many of these are in the hands of private collectors. About 40,000 were in the hands of an institute partly funded from South Africa within the last decade. The adobe building housing the institute was occupied by Islamists when they first arrived in March of 2012.

During the present conflict, a few voices have spoken up on behalf of this patrimony of generations now long dead. UNESCO distributed a list of GPS coordinates for the locations of the document collections to the French and African military last month.

As of this morning, the Associated Press reports that the fleeing Islamists torched the adobe building housing the institute. Perhaps unsurprisingly, comparisons are being drawn to the destruction of a pair of giant Buddha statues, dynamited by the Taliban back in 2001. The difference, in this case, being that the iconoclasm is not crossing boundaries of religious identification, but has injected itself into a debate over the true nature of Islam. Fundamentalist Sunni parties have weighed North Africa's Sufi traditions and found them wanting.

As I follow this story, I keep running up against the question, Why care? Of course, Malians prize the documents because they testify to a proud heritage. Islamists destroy them because the existence of these documents offends their sense of truth. Western interests have not seemed interested in investing money in the preservation of the documents. Though it does offend our cultural instincts to see the human past, which we otherwise seem not to care very much about, go up in smoke. Nary a university or museum from Europe or North America is mentioned on the Tombouctou Manuscripts Project site; the Ford Foundation is the only exception to the rule. The story provides us with a foil, a mirror to look into and remind ourselves we are enlightened, while they are obviously not.

The past is always held hostage by the present--its pressing concerns, petty disputes, immediate obsessions. The best answer I have to the question, Why care? begins with an observation about how the preponderant weight of the present has an ability to distract from an obvious, but oft-overlooked, lesson of manuscript collections. If our intention is to use them to assert our own cultural superiority, these collections are not likely to support our weight. Like the mass of humanity trod underfoot while great men play the game of nations, the textual record is a silent witness to the underlying truth that all men are dust.

Manuscript collections provide us with this very honest estimation of our fragile selves. And that, my dear reader, is precisely why they ought to be preserved.

----------

Update: "Mali: Timbuktu Locals Saved Some of Their City’s Ancient Manuscripts from Islamists"
In interviews with TIME on Monday, the preservationists said that in a large-scale rescue operation early last year, shortly before the militants seized control of Timbuktu, thousands of manuscripts were hauled out of the Ahmad Baba center to a safe house elsewhere. Realizing that the documents might be prime targets for pillaging or vindictive attacks from Islamic extremists, staff left behind just a small portion of them, perhaps out of haste, but also to conceal the fact that the center had been deliberately emptied. “The documents which had been there are safe, they were not burned,” said Mahmoud Zouber, Mali’s presidential aide on Islamic affairs, a title he retains despite the overthrow of the former president, his boss, in a military coup a year ago; preserving Timbuktu’s manuscripts was a key project of his office. By phone from Bamako on Monday night, Zouber told TIME, “They were put in a very safe place. I can guarantee you. The manuscripts are in total security.”

Sunday, January 27, 2013

The Ironies of Misunderstanding Misanthropy

With a title like this, I can hardly expect sympathetic readers. Aside from being needlessly wordy, it contains a double negative that stubbornly refuses to equal a positive. Somewhere someone quicker than I has a name for a literary device that combines double negation with alliteration. I only observe that the negated words both impinge upon us as human beings. Some human being has not understood what it is to disdain, hate, or not to trust human beings. Dreadful business. Hardly an encouraging reason to read on.

Misanthropy is a dangerous charge to throw around. Even if a person were to think another person misanthropic, common human decency cautions silence. It may be true. Then again, it may not. And if it's not, the accuser has opened themselves up to the same criticism. Because, in point of fact, they were the ones who lacked trust, misrepresented, and yes, most likely demonized the other person in the process. Had they trusted, had they given their interlocutor the benefit of the doubt, they would not have exposed themselves as the ass that they actually are.

I labour these points because I need to cover my own tracks. In a moment, I will point out that the misanthropic chickens in Wesley Smith's article 'Environmentalism's Deep Misanthropy' could very easily come home to roost. I rather think they do. But out of deference to Smith, I won't outright label his argument misanthropic. Instead, I want to tease some of the bits of his argument apart. The reader is more than able to pass their own judgment on the matter.

Smith's basic contention is that environmentalism, through mouthpieces like David Attenborough or David Suzuki, is dangerously misanthropic. Attenborough is on record comparing the human race to a plague of locusts. Suzuki comes across no better, apparently suggesting we are fruit flies. If your mind immediately wanders to a vile Agent Smith from The Matrix describing the human race as a disease, you wouldn't be far off.

Except there is nothing to suggest that either Attenborough or Suzuki intends to channel Agent Smith's rather dim view of humanity. Ironically, and very much like Agent Smith, this Smith appears unable to distinguish between what we human beings are and what we do. (The distinction must be made if you are to pass a moral judgment on someone's action.) In order for his argument to work, he stretches the critical comments environmentalists make about human activities detrimental to the environment into determinative statements about human nature. And it is a stretch. For it amounts to saying, if we behave like fruit flies, locusts, or viruses, presto! ipso facto we are fruit flies, locusts, or viruses.

Smith's argument is an intellectual conjuring trick. The reason is that he deliberately indulges in exaggeration in order to vilify his opponents. Otherwise put: Smith is divining the 'signs of the times' with an impressionistic methodological approach on the level of a crystal ball or a Ouija board. He sees what he wants to see, but he doesn't hear what other people have to say.


That's not all. Smith falls on his own sword a second time. I note that Smith joins to misanthropy the charges of Malthusianism and radical wealth redistributionism. The Reverend Thomas Malthus claimed that population sizes increased exponentially, while the natural resources available to a population only increased linearly. The difference between incremental multiplication and incremental addition gave rise to a 'struggle for existence'. Charles Darwin borrowed the term in his Origin of Species, which Herbert Spencer later described in more colourful terms as 'survival of the fittest'.
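To make the arithmetic concrete, here is a small illustrative sketch of the divergence Malthus had in mind; the starting values and rates are invented for illustration, not Malthus' own figures:

    # Illustrative only: invented starting values and rates, not Malthus' figures.
    population = 100   # doubles each generation (geometric/exponential growth)
    resources = 1000   # gains a fixed 100 units each generation (arithmetic/linear growth)

    for generation in range(1, 11):
        population *= 2
        resources += 100
        print(f"generation {generation:2d}: population {population:7d}  resources {resources:5d}")

    # By the fourth generation the population (1600) already exceeds the
    # resources (1400), and the gap only widens from there; that widening
    # gap is the 'struggle for existence' in miniature.

However the numbers are chosen, a doubling sequence eventually overtakes any fixed increment, and that is the whole of Malthus' arithmetic point.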

The second irony in the Smith article is that he confuses Malthusianism with grand attempts at social engineering. An actual example of Malthusianism would be the early 20th century German demand for more land, in order to secure natural resources for a growing population.

Translated into socio-economic terms, Malthusianism, which is also known as social Darwinism, provides the basic argument 'I deserve to keep what's mine' against wealth redistribution. I quote from an online resource:
Social Darwinism also justified big business' refusal to acknowledge labor unions and similar organizations, and implied that the rich need not donate money to the poor or less fortunate, since such people were less fit anyway.
Environmentalists can be either Malthusian or proponents of wealth redistribution. They cannot be both.

Smith can oppose wealth redistribution, but that means he will end up looking a lot more like Malthus, and a little more misanthropic.

Saturday, January 26, 2013

The King's Body

In 1970, a king was buried in American soil. It is worth pointing out that he wasn't an American, nor part of a suppressed lineage of disenfranchised American royalty. The closest the United States ever came to possessing its own royal house occurred when George Washington politely declined an offer of a bona fide American crown.

With the sort of inevitability one can only perceive in retrospect, the bodily remains of the only king ever to have been buried on American soil were repatriated a short time ago, as the Associated Press, via FOX News, has reported. It's a curious story. The personal circumstances that brought a king to take refuge on American soil reveal something of the dramatic political upheavals of the 20th century. As do the circumstances that have taken him home.

So much of that, however, is left unsaid to meet today's rigorous standards of journalistic impartiality.

The king in question was Peter II Karadjordjevic, last king of Yugoslavia. German Nazism drove him from his homeland at a very young age; Soviet Communism kept him from returning.

Almost as interesting are the protests of Serbian royalists in the United States against the repatriation of the bodily remains from their resting place in the grounds of a Serbian Orthodox Church in Libertyville, Illinois. It seems as if two grand ideologies clash over his person: a modern conception of personal liberties, honouring the king's alleged request, as Serbian royalists in America have it, to be buried on American soil, and a much older idea that held that kings possessed not one, but two bodies. One was the king's personal body; the other was the body politic, the land and its people. It only seems fitting and good, as royalists in Serbia, including the dead king's son, allege, that the one body should take the other to its bosom.

And, I suppose, one must keep in mind that Yugoslavia now exists only as a former name. National politics is written somewhere through this story, in which ecclesiastical concerns get a certain amount of play.

Why else would American Serbians not want to see their king finally laid to rest in the same ground that was his by royal right, even if not by recent political convention? What does it matter what the king himself wanted? The king is dead; long live the king!

This story reported by the Associated Press is flat. Journalistic objectivity deprives it of temporal depth. Oh certainly, the very general contours of a story are introduced: Nazism, Communism, etc. These are, however, essentially non-entities. They are too big, and much too abstract, to provide answers to questions of more immediate relevance. They explain how he got here. They may even explain, in a round-about way, why he is going back. But they do not explain why some of the actors involved don't want his body moved.

I wish there were an explanation. Because, if you look carefully, it's this conflict that makes the story more than a simple reporting of facts.

Friday, January 25, 2013

Answering Dawkins' Question, What do Theologians Study?

Richard Dawkins has asked WHAT theologians study. Not only on Twitter, of course. Also here on his own website, and in various places throughout his published writings. His prose betrays a sense of exasperation. He honestly doesn't get it. The reaction of theologians appears cut from the same cloth: What does Dawkins mean WHAT do theologians study? Theologians stand in a long tradition of reflection on the ways of God with humankind. They are doing exactly what people have always done; the only difference being they are doing it in a scientific age. They honestly don't get what Dawkins doesn't get, or why he doesn't accept their arguments. The conversation seems to have reached an impasse.

My purpose here is to ameliorate the relations between these two solitudes. A little further on, I am going to attempt an answer to Dawkins' question about WHAT theologians study, one which works within theoretical boundaries he accepts. But first, an account of what I think are those boundaries is a necessary preliminary.

There are two types of theology that Dawkins rejects: natural and supernatural. The natural theology he rejects is of the sort found in the Natural Theology of William Paley, who saw evidences of an intelligent Creator in the complex constructions of living organisms. Paley held that individual organisms may vary in bodily appearance from one to another, but the species or generic types of organisms remain unchanged through the whole natural history of life on the planet. Evidence for divinity was in the beautiful and complex construction of the final product: the living organism. Then Charles Darwin, earlier an admirer of Paley, published The Origin of Species, which undermined the intellectual foundations of the idea of unchanging species. His work allowed us to think of the final product, the living organism as we presently observe it, as the result of the long and somewhat haphazard process that is natural selection. The golden thread stretched between the unchanging nature of God and the unchanging nature of species, as a consequence, was cut, which naturally raised the question of whether there was a God to be found at the other end in the first place.

The so-called supernatural theology that Dawkins rejects is the idea of a supernatural agent who interrupts the natural processes of the world in order to effect some change on behalf of some and not others. Our world exudes a natural regularity, and we may be relatively confident that scientific methods are the means by which to understand that regularity. Old stories about a God calling Abraham out of Ur of the Chaldees to go to a promised land, healing the sick, feeding the hungry, smiting enemies, and, last but not least, raising people from the dead don't fit into this picture.

Not unjustly, Dawkins places a premium on empirical observation. If the blind see and the lame walk, or some other so-called miraculous occurrence transpires, we are better off looking for a natural explanation. Be honest with yourself: were you to hear that Jimmy or Sally rose from the dead, you'd be skeptical. The question Dawkins puts to many Christians is this: what stops you from extending your skepticism towards the resurrection of Jimmy or Sally to the resurrection of Jesus?

A pious Christian response might be that they stand with Dawkins, as concerns the two other deceased persons, precisely because this sort of thing only happened once--with Jesus. The Christian is more than likely to accept the scientific argument from the regularity of the natural order; they just don't think, as a matter of faith, it applies to Jesus. And, they might add, there is no way to determine the matter scientifically either way--which, from Dawkins' perspective, no doubt beggars belief. It does seem rather arbitrary.

When Dawkins asks WHAT it is that theologians study, he wants them to show him something, anything, he can observe and verify through subsequent testing. Whether that observation occurs with the aid of a microscope, telescope, or a sensory apparatus in a large hadron collider feeding data that is displayed on a computer screen is immaterial. Give him something to look at, something that other scientists can look at, something that they can have a rational, dispassionate conversation about. That's all he has ever asked.

----------

I propose to give him what he has asked for.

Let me preface the rest of this not-so-small disquisition by saying that I don't think what I have to offer will satisfy all the criteria for a properly scientific statement about the way things are. My intention is to provide Dawkins with something to think about: something he can observe, the existence of which can be verified by others, and something he can have a rational conversation with other people about. And I am going to attach the claim that this is what some theologians--big names like Augustine, Aquinas, and Kierkegaard--have talked about. In sum: classical Christian reflections on the first few chapters of Genesis and the first chapter of the Gospel of John usually gravitate towards some fundamental, observable phenomenon. The one claim I won't be making is that this is what all theologians talk about, as I am in agreement with Dawkins that a lot of what persons who self-identify as theologians have had to say, in the context of public discourse, is empty fluff. Whether it has meaning within their own communities is another matter.

I offer for consideration that there is a very 'real', and very commonplace, distinction between human language, a category in which I include written, spoken, signed, and tactile words, and things in the natural world to which they refer, like the sun and moon, rocks and trees, insects, plants and animals. So commonplace, in fact, it is easily overlooked. And I want to offer this consideration in the context of a much broader distinction between what I shall designate 'artificial' objects, which include other things of human construction like buildings, clothing, computer games, and so on, and natural objects, like the aforementioned things populating the natural world.

But I don't want to identify precisely WHAT the nature of the distinction between artificial and natural objects is. It seems to me that jumping to identify the distinction will get us into trouble, as I will venture to explain more fully towards the end. All I want to observe at this point is THAT there is a distinction, and that we all recognize as much, at least implicitly. What is my evidence for that distinction? In an era when the world's natural resources are being harvested more purposefully and marshaled for the rapid extension of an urban civilization than ever before, but with much less forethought about the effect it will have on environmental conditions, examples are readily available. But let me borrow from Paley's famous Watchmaker Argument to illuminate my point for the sake of argumentative continuity. It was Paley's claim that encountering the complex construction of living organisms was like walking along a beach and discovering a watch half-buried in the sand. The watch naturally leads to an inference of an intelligent human creator; so living organisms quite naturally, it is assumed, give rise to the idea of a divine creator. The argument itself rests on an analogy drawn between human and divine creativity, crossing over the categorical boundary I have drawn between artificial and natural things.

Much ink has been spilt over whether the analogy is valid. That's all it is: ink. Let me be absolutely clear: right here and right now, I could not care less. My only concern is to note that, were you walking along a beach and saw a watch half-buried in the sand, you could be absolutely certain a human being was involved in its construction. In all probability, more than one human being was involved. The shape is too circular, the edges too straight, to be the composition of natural processes, given our knowledge of those same processes. The same goes for words. Words inscribed on any material surface whatever always have a human origin. For example, when Scripture says that rocks will cry out if human beings are silent, the intention of the text is analogical, and the being who originally put pen to paper, or quill to parchment, or what have you, to record the analogy for future generations was a human being. End of story. Full stop. Period. This far, no further.

I trust these observations are so basic, so banal, that there isn't any reason to labour these points much further. Perhaps my own wording could be improved upon. The fundamental point, however, about how I am able to use words to indicate a difference between a whole category of artificial things including words, which have human beings as their 'cause', and a whole category of natural things, which do not, should not be contentious.

If these claims are contentious for anyone, I am unsure how a person would communicate their objections without proving these claims in the process. They are readily observable AND verifiable.

----------

None of the foregoing sounds like the sort of argument a modern scientist would present in an academic paper. That's true. One persistent feature of the modern scientific endeavor has been to throw back the veil covering natural processes. It may appear that the earth is in the center of the universe, but Copernicus showed us how to think otherwise, Galileo elaborated upon those initial observations, and the rest is history. A whole web of overlapping scientific findings down through the last few centuries tends to confirm Copernicus' initial heliocentric hypothesis. Another example: you may have the impression that matter is solid. Disabuse yourself of any such idea--quickly! The list keeps growing.

What about the tool-using capacities of humanity's evolutionary cousins? What about the intelligence of dolphins? What about the capacity of many 'higher' species of animals to suffer? Don't these fields of contemporary scientific research demonstrate that the distinction I am trying to introduce for consideration rests on a faulty assumption? Well, no. My claim was THAT there is a distinction and that it is real. I avoided wading into a discussion about WHAT the nature of the distinction is or WHERE precisely that distinction lies. It's in the nature of human involvement in and with the natural world that the lines are more clearly drawn in generality than they are with respect to particular points of interest. My intent was never to 'throw back the veil covering natural processes'. In that sense, I willingly grant, the evidence I offer may have the slippery feel of a dialectician's obfuscations.

In my defense, all I can do is point back to the distinction. A word on a page has a human being as its cause. The particular dimensions and texture of a page also have human beings as their cause. The same cannot be said for the natural material from which the ink was prepared or from which the page was manufactured. Human beings came into this world, individually by birth, and collectively as a species by evolution, to find all of that natural material already present, the question of its origin just hanging around, as it were, waiting to be asked.

Now, some might expect that this is the end of my argument. That is, however, not the case. The great imponderable question of the Origin of All Things is only a stop along the way. Think about it. If the great imponderable question were the end, we would have gone a long way to say nothing. Or more precisely, by saying everything, we would have said nothing, because we failed to say something that the human mind can wrap itself around.

----------

I began by pointing to an observable (and verifiable) difference between words and natural things. For example: the word 'tree'. There is the word itself, and the thing to which it refers. We all know what words are, and we have all seen trees. Look outside. In fact, the word 'tree' is not the only symbol I can use to communicate to another person what it is I am looking at, or thinking about cutting down, or carving my name into, or whatever else a person might do with a tree. Different languages have different words for tree. Different languages have different ways of qualifying what sort of tree it is. I will stick with English because I find myself linguistically challenged. The tree in question may be a maple tree, it may be tall, it may be an old tree, and so on.

I am the cause of this particular instance of the word tree's usage, in precisely the same sense that I am not the cause of the object to which the word refers. It is my subjective intention, however, that this particular perceptible instance of the word tree should refer to that perceptible object. I hold the word and object together in my mind. As I know that we are both drawing on a shared English lexicon, my intention is further that this particular perceptible instance of the word tree should refer to that perceptible object in your mind as well.

So, at this point, neither the tree itself nor the word itself is my principal 'object' of interest. Rather: my subjective intention to communicate that the two are joined together in my mind is the object. The intention to associate words with objects is something so ubiquitous in human thinking, of course, that it is very easily taken for granted. This is not an excuse, however, for pretending it is inconsequential.

When certain theologians have reflected on the natural world and human beings in it, they have noted with the author of the second chapter of Genesis that human beings name things. In some sense, naming a thing is integral to understanding what it is and communicating one's knowledge to others. The human ability to name things frees us to mentally toy with and tweak the thing described. Certain circumstances allow us to take an idea and transform available material, as, for example, happens when a tree is cut down for the purpose of building a house.

One can safely assume, I think, that these theologians were suitably impressed with how the human being was capable of such remarkable feats in the first place. Augustine's discussion of the difference between words and things in De Doctrina Christiana is sensitive to these interests. Why should such a frail thing as a word be able to attach itself to an object and illuminate the human mind about the nature of objects? Does it also not say something, not merely about the frailty of human creations, but also about the frailty of human intentions, that not all the words we employ in certain situations are acceptable to other human beings? Everyone has had the experience of being told they are wrong or being offered a different perspective.

What do theologians study? Some theologians have studied the observable and verifiable distinction between words and the things to which they refer, or again between artificial and natural objects. The difference between the two members of each pair is a real difference. What is done with the difference remains to be seen. Some theologians have gone the route of contemplating the imponderable question of the Origin of All Things. The first chapter of Genesis--by setting up the contrast between God, who speaks things into existence, and the human being, bearing his image, who names already existent things--does more than contemplate. It sketches out a possible answer to the imponderable question. It's not the only answer, of course.

One gets the impression, however, when reading post-Kantian philosophy and modern scientific theory, that this readily observable and verifiable distinction is irrelevant to modern science. Only what fits into a theoretical or hypothetical framework, it seems, can be counted as real. Behind the rest is placed a question mark indicating a possible field of further study.

This doesn't mean that the distinction is not real. No indeed. All it means is that by some frail human intention, whether conscious or not, some think about the world in a way that renders the distinction irrelevant.

Thursday, January 24, 2013

I Believe in Kevin O'Leary: A Conversion Story

I am a convert to the Kevin O'Leary brand. It took me all of six months, but I have come full circle.

Americans will best know O'Leary as a venture capitalist on the ABC show Shark Tank. Canadians know him from playing an identical role on the CBC show Dragon's Den. Mr. Wonderful is the name he gives himself, and it isn't meant ironically. On the Lang and O'Leary Exchange, co-hosted on CBC Newsworld with Amanda Lang, O'Leary preaches the Gospel of Money--or Mammon, in the KJV rendering of Jesus' pointed observation that you cannot serve both God and...you guessed it, money. O'Leary is the real capitalist deal. Small government, not only in the sense of fewer actual services but also less regulation, and praise for the virtues of free markets are both part of his liberating message.

And if you think I am exaggerating by dropping biblical allusions, read the introduction to his book The Cold, Hard Truth: On Business, Money, and Life. 'When I speak the truth about money, I am almost speaking as money. That's why I come across as harsh, mean, and brutal. I'm just channeling money, in my attempt to help you understand it and amass it.' O'Leary is THE prophet of profit. His brash, difficult-to-digest exterior has earned him a large number of critics, which would only seem to confirm his exalted status. 'Verily I say unto you, No prophet is accepted in his own country.'

My conversion to the Kevin O'Leary brand had two parts. Doubts about the initial feelings of revulsion I felt towards all that O'Leary's on-screen persona represented prompted me to look closer at who he was and, more importantly, what he had written. The Cold, Hard Truth is O'Leary's blitzkrieg on all that fluffy sentiment about the way the world ought to be and should be that clouds a businessperson's financial judgement. No one needs to 'like' what O'Leary has to say, in the same sense no one needs to like what Machiavelli had to say in The Prince. But that doesn't mean that what he has to say is not true, if by true you mean that it has purchase in a real world in which a person needs food on the table, clothes on their back, and a roof over their head. His latest, The Cold, Hard Truth on Men, Women, and Money, promises a similar shock and awe campaign against wishful thinking.

The second part of my conversion occurred a couple of nights ago, when Amanda Lang proceeded to excoriate O'Leary for accusing media figures, like herself, of a liberal bias against big business. On most issues, I fit quite comfortably into Lang's pocket. When media figures report on the iniquities of exceedingly large corporations, I think they are doing their job. Just because a person has a whole lot more money than I do doesn't automatically make them correct. Apologies, Mr. O'Leary.

But Lang rose up with righteous indignation at O'Leary's suggestion that her profession widely misses the mark of 'objectivity'. Who was he to call the ideals of her professional vocation into question?

That was the moment my eyes were opened. I saw a golden, 24-carat halo on O'Leary's head, and the angels of finance fluttering in the background.

I have no difficulty with a liberal media bias against big business. I do have difficulty with a liberal media pretending it has no bias. If bias is not buried in a story's details, then you can find it in the sorts of stories they choose to tell.

In her direct, articulate way, Lang described the nature of O'Leary's sins against the journalistic profession. But she hid herself, and her own personal interests and motivations, behind a protective shroud of journalistic objectivity.

O'Leary simply stared into the camera and smiled. He had nothing to hide behind, nor any need.

Wednesday, January 23, 2013

Wake Up

When asked if he was a saint, angel, or god, he responded simply, 'I am awake'.

The nature of his response is such that we are still trying to make sense of what he said, and that's the point.

The man in question was Siddhartha Gautama, formerly a minor princeling of the Shakya clan, who lived around the turn of the 5th century B.C.E. in Northern India or Nepal. His name means 'he who achieves his aims'. Shortly after the child's birth, a soothsayer was summoned to divine his destiny. The signs were auspicious, though the path unclear. Either he would be a chakravartin, which means universal king, or he would found a new religion.

A prince himself, the father desired that his son become a conquering hero, and promptly lavished upon him all the trappings of royalty. Palaces were filled with feasts and dancing girls. A young woman was made his wife; she bore him a son. The story of the Buddha's early life can be read against the backdrop of the Hindu caste system. He was born a kshatriya, among princes and warriors; his people looked askance at brahmins, the sages and priests. Becoming the chakravartin was the default option.

What does it mean that the Buddha woke up? To begin with, it means that the father failed to entice the son with pleasures or power. It means that the Buddha went on to found a new religion. And it especially means that the Buddha gained insight into the nature of human life: suffering. He saw that suffering was suffering, and that the joys and tangible pleasures of human life are also suffering, because they are fleeting.

One can, with a certain amount of ease, describe the Buddha's path to enlightenment. It reads like a dialectical ascent, in which every step negates or subsumes within itself previous steps. The narrative form of a life is easy for a reader to slip into. Hermann Hesse's Siddhartha is an appropriate place for the English reader to start. The story of the Buddha is, or ought to be, the story of Everyman. And paradoxically, by being the story of Everyman, it is also only the Buddha's story, just as everyone has their own stories. He can show you the way and he can show you how to start; the end is yours to discover on your own.

Seated beneath the Bodhi Tree on the 49th consecutive day of meditation, the Buddha awakened to the discovery that there is no solid, substantial self, no immortal, deathless soul 'beneath' the bodies that we are. What this might mean is especially difficult to communicate. For example, it makes a certain amount of nonsense of the statement, 'I am awake', since there is, in the final assessment, no I in me that wakes up.

The implications one draws from what appears at first blush a simple logical contradiction depend on the sorts of assumptions brought to the table.

Through the 19th century and into the 20th century, Buddhism has earned a considerable amount of respect among American and European readers. The most well-known contemporary representative, the Dalai Lama, speaks to sold-out crowds wherever he goes. The absence of a personal God seems especially appealing to scientifically-minded folk: no apology need be offered for the apparently arbitrary actions of a divinity who reaches down into the flow of human history to tweak circumstances to the advantage of some and not others. So also is the apparent critique of an immortal soul, so integral to many Christian accounts of salvation. At the same time, accounts of the Buddha's former lives, and the many saints, angels, and gods that populate the Buddhist cosmos, are conveniently overlooked.

To post-Kantian readers, who are numbered in the above-mentioned groups, Buddhism reads like a critique of Cartesian rationality, but in fact it is nothing of the sort. The Buddha's message is a critique of bodies that consume other bodies for the sake of securing for themselves perpetual and pleasurable existence. To check human desires, the Buddha discovered that one ought not fight with one's desires as if they were one's own. When the 'I' from the 'I want this' or 'I want that' is cut out of the equation, so also is the want for 'this' or 'that'. More and better food, bigger and grander houses, younger and looser sexual partners, those material things of the world that go beyond what I require to maintain my own existence: the futility of these things is what the Buddha woke up to.

Western readers, owing to a penchant for conceptual abstraction, have a difficult time understanding what is suggested. The tendency will be to fixate on some apparent logical contradiction, like the above statement about there being no I in me to wake up. The statement is meaningless, however, apart from actually setting off down the Buddha's Middle Way, which directs persons neither to gorge themselves on physical pleasures nor deny themselves the basic necessities of life.

Armchair intellectuals and casual bloggers, like myself, cannot actually claim to 'understand' the Buddha's teachings. We draw a distinction between theory and practice, where one does not exist.

Tuesday, January 22, 2013

Justice Antonin Scalia, a Moron?

The speculation is that the hat Justice Antonin Scalia wore to President Obama's inauguration was a replica of the hat worn by Saint Thomas More in Holbein's famous portrait. Everything appears to be speculation at this point, as I am unaware that Scalia has revealed his mind on the matter. But even if it should hold true, we are still left with the question of what to make of Scalia's symbolic provocation.

Symbols have a way of escaping the intentions of those who use them. A Catholic himself, and a vocal critic of Obama's public policy, Scalia most likely wore the hat as a silent protest against Obama's presumptive exercise of Presidential authority. On this reading, Obama assumes the role of Henry VIII, who usurped the spiritual authority of the church by making himself head of the English Church. Scalia plays the part of More, whose protests earned him a martyr's crown, and, in due time, sainthood.

There is a second possible way to interpret Scalia's actions, albeit a much less likely one. It has to do with his claiming common cause, not merely with a political martyr, but with a bona fide saint enrolled in the ecclesiastical calendar of the Catholic Church.

It is remotely possible that Scalia has absolved Obama for his outrages against both himself and God's church. That's what saints who are martyred do--saints like Saint Thomas More. They swiftly forgive their executioner, commend the souls of all present to God, and patiently wait for the axe to fall, the hammer to drop, the rope to snap, or what have you. That's what Christ did in Luke 23, as he hung on the cross.

Let's explore the latter possibility a little further, just for the sake of argument.

The Renaissance scholar Desiderius Erasmus' In Praise of Folly (Moriae Encomium) is usually published with a letter written to More. The letter is a reflection on the apparent etymological connection between Folly and More, which is still preserved in our word moron. The letter commends Erasmus' panegyric on folly to the protection of More, against its detractors, who mistook its praise of folly for a frivolous defense of idiocy.

The wisdom of Folly:
"Briefly, no society, no association of people in this world can be happy or last long without my help; no people would put up with their prince, no master endure his servant, no maid her mistress, no teacher his pupil, no friend his friend, no wife her husband, no landlord his tenant, no soldier his drinking-buddy, no lodger his fellow-lodger -unless they were mistaken, both at the same time or turn and turn about, in each other."
A dose of foolishness helps us bear with each other.

If Scalia did lay claim to the foolish saintly aspect of More's legacy, he is entirely unjustified. Peter Ackroyd's The Life of Thomas More beautifully chronicles More's inner turmoil as he groped to understand whether obedience to the king was necessarily disobedience to God.

When he had made up his mind to oppose the king, More determined to let the ways of the world have their way with him. He is even recorded to have said to his executioner, 'Thou wilt give me this day a greater benefit than ever any mortal man can be able to give me. Pluck up thy spirits, man, and be not afraid to do thine office. My neck is very short: take heed, therefore, thou strike not awry for saving thine honesty.' Much like Christ, again, who goes willingly to the cross.

This kind of foolishness Scalia most likely did not intend. If indeed it was a replica of More's hat that sat on his head, his symbolic appropriation debases the memory of More's saintly martyrdom. In death, saints do not so much confront the worldly powers with another worldly power sanctioned and 'sanctified' by a Church. In their own persons, they demonstrate the futility of all worldly exercises of power, including their own. Saints transcend mere political grandstanding, however subtly that is enacted.

Justice Scalia is not this kind of moron. That's why, we can hope, it wasn't More's hat he was wearing.

Monday, January 21, 2013

No (Religious) Label

I was educated into an intellectual tradition that insisted everyone was religious, even if they were a-religious. Fundamental to our human constitution, so the assumption went, was an internal yearning for existential fulfillment, which could not ultimately be satisfied by just this or that. It had to be satisfied by something transcendent capable of embracing every this or that you could point to, think about, and name. In the language of traditional metaphysical argumentation, to paraphrase St. Thomas Aquinas, this something transcendent is what is meant by God.

My educational programming predisposes me towards thinking that the need for an existential fulfillment automatically manifests itself empirically in terms of identifiable, observable religious affiliations, even if those affiliations are a-religious.

Enter the Pew Forum on Religion & Public Life, which published a study arguing that '"Nones" are on the Rise: One-in-Five Adults have No Religious Affiliation'. What do we make of persons who refuse to affiliate--whatever that might mean?

Somewhere between the prescriptive ideal of the human being as a religious being and a descriptive account of the reality of things, our best theoretical attempts seem to go awry. Predictable defensive attempts to distance 'religion' from a surrounding cultural ethos that prizes therapeutic solutions to our everyday problems avoid altogether the problem of religious affiliation. True religion, presumably, is more than a panacea, and affiliating with the likes of the occupants over in the next pew can be trying stuff at times. It takes true, God-glorifying grit and demands the best of us. Here affiliation is the norm against which everything else is judged. Other attempts to patch the breach introduce a half-way house category of 'spiritual, but not religious', which has a limited purchase, though it is not without its own difficulties. For example: who actually qualifies for membership? And do they have any say about whether or not they accept membership? The crux of the matter, after all, is whether affiliation is both identifiable and empirically verifiable.

A more sensitive reading of the present religious ethos is provided by Daniel Silliman:
Perhaps the "nones" aren't mainly to be thought of as religious or non-religious in a new or different way, but as a group that has a problem with the question, in much the same way that some protest self-identifying with a particular race.

Those people aren't themselves a new demographic, but rather are saying that they think the category is a problem.
My own experience interacting with students in a class studying world religions inclines me to agree. Students seem fairly comfortable examining all religious possibilities, without ever feeling that their personal failure to commit themselves to one, to the exclusion of all the others, is a significant problem. They travel lightly through life, so to speak, unlike their grandparents.

The problem may be purely categorical. An empirical study like that made by the Pew Forum contains an implicit assumption that all the possible answers to questions about religious affiliation are cut from the same cloth. Attending church, mosque, synagogue, or temple is regarded as conceptually equivalent to not attending anything; in the same way, antelope, moose, and deer are treated as generic equivalents to the caribou. The difficulty is obvious: a negation of the members of a set does not amount to a positive addition to the set.

Looked at in a slightly different way, the problem may conceal a moving target. Even if you insist that everyone is essentially religious, while allowing that some are superficially a-religious, for example, this is still not quite the same thing as saying everyone is essentially a theist, while allowing that some are superficially atheists. Different things are being referred to. The term religion directs us to consider a set of readily identifiable human phenomena, sets of beliefs and ritual practices oriented towards a transcendent reality. Atheists could manifest these otherwise religious markers. A.C. Grayling has put out a canonical collection of atheist writings titled The Good Book, while posthumous devotional works by Saint Christopher (Hitchens) pour off the presses. Certainly sounds religious. On the other hand, the term theist directs us to consider...what exactly?

The term religion appears broad enough to include within it everything that might be defined under the term theist. A theist is someone who believes in the existence of God--which is to say, theism is a religious belief. But theism also entails a stronger claim that cannot be subsumed under the definition of religion. Namely: the existence of God, though it may be bound up with the human belief in the existence of God, cannot be reduced to it. For the theist, God is never merely a belief; if the human mind is not able to comprehend God, neither can it expunge him from existence.

The category of religious affiliation, as Silliman ably articulates, fails to comprehend the 'nones'. Curiously, his 'nones' find themselves in exactly the same position as my 'theist'. The question is why.

One obvious answer is that the superficial markings of religious affiliation never go deep enough. They provide a rough and ready map of the social terrain, but inevitably distort its image, much like a 2-D map of the globe distorts the spherical object it represents. Relative to the center, the edges of the map are grossly exaggerated.

The less obvious, but no less important, answer is that we have to think through what exactly it is we suppose our categories represent. And conversely, we need to question what they can accomplish for us. Do we use our badges of religious identification to define ourselves over against everyone else? If the 'nones' have discovered that the interpretive powers of categories of religious affiliation are limited, it won't do to fault them for not affiliating.

Sunday, January 20, 2013

Religious Freedom for Everyone?

The whole debate about the status of religious freedom in the United States taxes a person's credulity. Most of the commentary I read is written from a position of certainty. Religious freedoms are being eroded. This is happening. Listen up: you need to be aware.

The certainty is unnerving. Something is happening, I will grant you, but I am not quite sure what exactly. The intensity of conviction with which the case for a fundamental challenge to religious freedom is being made, I would venture to suggest, is an affront to all true religion--all true piety. Christ did say, after all, You know neither the day nor the hour. These words of caution apply not only to the End of All Things, but to every single moment leading up to it. They are what inspired St. Augustine to write The City of God, whose main thesis can be summed up, 'Okay. We might be here for a little longer than we expected.' They ought to remind us, with a measure of certainty, that every word that escapes the human mouth has a necessarily questionable purchase on the things spoken about.

Half a day ago, Daniel Silliman tweeted an article describing how 'Most Americans Think Religious Freedom [is] Fast Declining in US'. Here's a summary of the findings of a study carried out by a California-based research group:
More than half of Americans (57 percent) believe "religious freedom has become more restricted in the U.S. because some groups have actively tried to move society away from traditional Christian values." This opinion is more common among practicing Catholics (62 percent) and Protestants (76 percent) and is nearly a universal perception among evangelicals (97 percent).
Never doubt that some nugget of truth may be found buried in the well-intentioned beliefs of a random sample of adults. It is difficult to know, however, whether everyone surveyed understood the same thing when they heard the words 'religious freedom'.

The first question that popped into my head was: Religious freedom has become more restricted relative to what? The same as in the 1990s? The 80s? The 50s? The 20s? The question of temporal scale always has to be kept in mind. However you define religious freedom, it is undoubtedly advancing with great strides relative to the 16th and 17th century wars of religion in Europe, or the 4th and 5th century struggle over the establishment of Christianity in the Roman Empire, and so on. The way the study was formulated, one gets the sense that judgments about the contemporary prospects of religious freedom were simply being drawn from personal memory.

The other question that arose in my mind had to do with the assumed nature of religious freedom. Was it identified with the freedom of conscience, the freedom to believe, which, in principle, is the right of every person? Or is it the freedom to congregate with one's fellow believers and participate in the rituals of one's religious tradition? Neither of these possibilities seems likely, since those surveyed seemed to identify a 'restriction of religious freedom' with a collective shift 'away from traditional Christian values'. When listening to Christian America speak about religious freedom, it is very difficult not to draw the conclusion that not everyone is free in quite the same way.

It seems President Obama is out of step with a majority of Christian America. On January 16th, by Presidential Proclamation, he declared the day Religious Freedom Day. Most people, one assumes, either doubted Obama's sincerity or his intellectual ability to identify that of which he spoke. On the other hand, Obama followed Thomas Jefferson where he 'affirmed that "Almighty God hath created the mind free" and "all men shall be free to profess . . . their opinions in matters of religion."' Historical references to constitutional precedents are usually a good indication of intellectual acumen. Reference to Almighty God, whether seriously intended or not, signals that whatever religious freedom is, it is meant for all, in the same way.

These are very different conceptions of what religious freedom entails. It's really no use pretending that we are talking about the same thing. The difference between them gets at an important question about how far one allows particular sets of convictions to dictate public policy. Because if Christian America identifies a restriction of religious freedom with the retreat of traditional Christian values, it is not religious freedom itself that is being challenged. It is a particular set of convictions. The article concludes:
"Evangelicals have to be careful of embracing a double standard: to call for religious freedoms, but then desire the dominant religious influence to be Judeo-Christian," said Kinnaman, author of the book, unChristian. "They cannot have it both ways. This does not mean putting Judeo-Christian values aside, but it will require a renegotiation of those values in the public square, as America increasingly becomes a multi-faith nation."
Nor, I would add, does one have to assume a 'renegotiation of those values in the public square' automatically entails that Christians will have to set aside Christian values. Though they may have to think carefully through the ones they presently hold so dear.

I myself don't think religious freedoms are being fundamentally challenged in the United States. A more likely explanation is that demographic shifts are forcing Christian America to question whether it is quite as exceptional as it makes itself out to be.

Saturday, January 19, 2013

Twitter Karma

A story hit the internet a few days ago that may have slipped the attention of most. The US Library of Congress has begun banking the entire record of the Twitterverse. Did the Twitterverse notice? Not really, as far as I can tell.

The idea that someone or something is out there recording everything we might think to regurgitate online is supposed to terrify us. This is the Big Brother George Orwell warned us about, isn't it? The reality, as James Gleick describes it, is not quite so straightforward.
This is an ocean of ephemera. A library of Babel. No one is under any illusions about the likely quality—seriousness, veracity, originality, wisdom—of any one tweet. The library will take the bad with the good: the rumors and lies, the prattle, puns, hoots, jeers, bluster, invective, bawdy probes, vile gossip, epigrams, anagrams, quips and jibes, hearsay and tittle-tattle, pleading, chicanery, jabbering, quibbling, block writing and ASCII art, self-promotion and humblebragging, grandiloquence and stultiloquence. New news every millisecond. A vast confusion of vows, wishes, actions, edicts, petitions, lawsuits, pleas, laws, proclamations, complaints, grievances. Now comical then tragical matters.
I confess, every so often I tweet, but I have no illusions about the infinitesimal significance of absolutely every tweet with my name attached to it. I also have no qualms about the Library of Congress indexing my ephemera. Nor everyone else's ephemera either. The fear of Big Brother is founded on the idea that behind every piece of technology there is an unpredictable, possibly even capricious, personality waiting to stick it to you. It's to let your imagination run away with your rationality.

Presumably, if someone searched for my name, they could find whatever I had thrown into the vast Twitterverse. Some of it might be incriminating in some small way or other, but that has less to do with the technology than with the person interested and, more to the point, with me. If that hypothetical person followed the changing content of my tweets, they might be able to make sense of my changing interests. But how much would it actually reveal about me?

The Twitterverse could be thought of as governed by the law of karma. Whatever has already been tweeted follows us around wherever we go, with the potential to jump up and kick us in the ass at the worst possible moment. Though this is not because karma has it out for us. Karma is absolutely impartial, because whatever it does to us, we have, in point of fact, done to ourselves.

Of course, on the other side of the coin, the more likely way things play out is as an indistinguishable straw in a haystack.

Combined with a little prudence on your part, the likelihood of Big Brother finding anything incriminating is slim to nil--and slim just left town.

Friday, January 18, 2013

The Lessons of History

I trust everyone is familiar with the old platitude about how those who don't study history doom themselves to repeat it. In fact, the platitude does not go far enough. Absolutely everyone is doomed to repeat the so-called mistakes of history. There are no exceptions. The historically enlightened, in this instance, are no better off than the historically ignorant. The mere possession of knowledge does not always lead to action. Knowledge of the human past likewise rarely yields clear actionable directives.

The lessons of history lie elsewhere. On the weekend of the US Presidential Inauguration, for example, an obvious lesson is that every time someone uses a word like historic or historical to describe the significance of the event, you may be confident they are telling an untruth. (Wolf Blitzer, I am talking at you.) Judgment about historical significance needs the advantage of hindsight, hopefully at least a couple of decades, but preferably a few centuries. It does the rest of us no good if commentary on the epochal significance of something happening right this very moment is being offered up willy-nilly. And it ought to reflect badly on the person making the gross overstatements. Goodness knows claims to the sort of god-like prescience needed to determine whether this particular event, occurring right here and right now, before our very eyes, will be decisive to the unfolding of the history of a people, a nation--lo, the human race itself!--should give us pause.

The problem is that it rarely does. Intellectual modesty proportionate to the human dimensions of our lives is largely absent from public discourse. It is enough to make a person cry for Heaven to restore Karl Marx from the grave to teach us once more his analysis of the production of false consciousness. The gods, in everything but name, walk among us once again. A new bourgeoisie foists its vision of impossible greatness upon a mass of slack-jawed proletarians. Against their better socio-economic interests, the proletarians embrace the intangible feeling of belonging to something larger than themselves. Crushed under the burden of an economic order designed to squeeze them for every possible penny, they have freedom dangled before them like a carrot on a stick, held out for them to taste, if only they work just a little harder to reach it. The circle is vicious, terminal even.

One lesson that might be learned studying history may be summed up in a formula. The level of histrionic content in any assignment of historical significance is inversely proportional to the temporal distance between the event and the judgment passed on it. Which means the closer those two are together in time, the greater the chance that our would-be historian is a liar--in the strict sense of holding up something as truth that corresponds to nothing in reality.
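Rendered schematically--a rough sketch of my own, with $H$ standing for the level of histrionic content and $\Delta t$ for the temporal distance between the event and the judgment passed on it--the formula would read:

$$ H \;\propto\; \frac{1}{\Delta t} $$

So as $\Delta t$ shrinks toward zero, with the pundit pronouncing while the event is still unfolding, $H$ grows without bound.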

Another is that Nietzsche was wrong: there is no twilight for idols, if the sun never sets.

Thursday, January 17, 2013

The Quixotic Study of Religion

As someone who instructs students in a course on the study of world religions, I have to ask myself whether this sort of thing is even possible. If I speak to students about Buddhism, Hinduism, or Islam, for example, I labour under the obvious difficulty of not being an adherent or devotee--or more precisely, of not being an insider, of being an outsider looking in. I understand the import of this objection, though it is not the end of my troubling story of apparent self-deception. The contemporary academy prizes specialized knowledge, and the more specialized, the better. I haven't spent the requisite 10,000 hours needed to become an expert on any one of the religious traditions examined in the course, let alone on all of them, which, all told, would amount to at the very least 70,000 hours. It would seem, consequently, that I shouldn't bother trying to teach a course that covers at least seven major religious traditions.

But we are 'Down in La Mancha', where anything is possible, if only you believe hard enough. Here windmills are giants, and are easily dispatched with a bold cavalry charge. In the spirit of 'tilting at windmills', I want to suggest the subject matter of a course on world religions, in point of fact, encourages the development of a peculiar sort of expertise in everything. And it won't take 10,000 hours to master the basics.

The following is taken from the Isa Upanishad, which stands to the rest of the Upanishads like Genesis 1 stands to the Bible and the Al-Fatiha to the Qur'an.
Whoever sees
All beings in the self (atman)
And the self in all beings
Does not shrink away from it.

For the one who knows,
In whom all beings have become self,
How can there be delusion or grief
When he sees oneness?
The only real barrier standing in the way of comprehending what these verses entail is language. As the text has been translated, the English reader already possesses all they need to know to understand its meaning. The way forward is not to get lost in the foreign-ness of the text, but to consider where one might situate oneself in it.

The meaning of all beings is obvious: all things, anything you can think of, everything you can think of, etc. The comprehensiveness of the statement is important. Detractors who doubt whether a course on world religions will ever be able to do justice to individual traditions, on account of the impossibility of mastering all the particulars, are served a notice of eviction from the conversation. Get lost in the particulars, and you mistake what is distinctively religious for something else: culture, tradition, politics, etc.

The references to perception ('Whoever sees', 'he sees'), mental states ('who knows', 'delusion'), and emotional responses ('shrink away', 'grief') are also fairly straightforward. The sort of seeing implied is not the perception of visual objects. In English we very easily exchange the verb 'to see' for 'to understand'. The same applies in this circumstance.

The final element, the self, is the most difficult, because it is not immediately clear who this refers to. Myself? Ourselves? God? This is the wrong way to go about thinking of the self. The better way in is to ask how it is possible that I could see 'All beings in the self' or how 'all beings have become self' in myself.

In other words, how is it that I can put an entire world/universe of things into myself? Think about this carefully. It is not actually as counter-intuitive as it seems. It is rather quite intuitive. Everything you think about is, in a certain sense, inside of you. For example, I am thinking about Ayers Rock in the Australian outback. I am appropriating it to myself. The same goes for any other thing I might think about.

Now, the obvious objection is that these things don't actually become myself. They still have existence outside of myself. Well, yes. That's the typical natural-scientific response. On the other hand, the world's major religious and philosophical traditions are more interested in the peculiar, and seemingly endless, capacity of the human mind to learn about things. As long as we have the bodies that we do, there will always exist an impediment, the Isa Upanishad suggests, to the complete realization of the self in all beings, and all beings in the self. Our bodies prevent us from absorbing the entire world into ourselves. Our bodies place limitations on what we can know. I can only read so much, see so much, study so much, before I grow tired, or before I die. A sage or swami can have an insight into the truth of existence in this life; but the ultimate goal is for the mind to shed the limitations of the body and embrace in itself all things.

Tuesday, January 15, 2013

Reflections on Memory and Death

My father's father is dying. I traveled to my hometown this past weekend to visit one last time, and was able to talk with him for about an hour and a half, much as I have in times past.

The conversation ranged through the storehouses of our memory. Recounted were memories of working together over the better portion of a decade, while I still lived at my parents' house, just down the road from my grandfather's farm. There were also the memories of living with my grandfather and grandmother through the summer months after my first year at university. He followed me readily enough down these different lanes, only being tripped up once along the way. And with a little more confidence, he recalled episodes that predate even my father's life: from times during "the War", as he calls it, and from getting off the train in small-town south-western Ontario, shortly after emigrating from Holland to Canada.

My grandfather has the good fortune not to be afflicted by Alzheimer's, like his sister, or my mother's father, who died a number of years ago. Not that his memory is perfect, nor that his mental acuity is what it once was. He did seem self-conscious enough, though, to recognize that his memory is not what it once was. This sort of presence of mind, I take it, is a ready indication that one is still in possession of one's faculties.

My grandfather does not seem much perturbed by the fact he is dying. When I first arrived, he had rather dryly observed that it was quite busy at the house because so many people were stopping by to visit. His humour was so dry that I initially mistook this for a matter-of-fact observation. Only afterward did I recall his slight smile.

Towards the end of my visit, I observed that my grandfather had a considerable amount of time to think about the end of life. I ventured to tell him I was happy to know that he was content with his lot in life, especially now that life was coming to an end.  My grandfather pointed out that young men do not think much on their own mortality. His implication seemed to be that I had showed my youthful cards by venturing as much as I had.

That got me thinking about why youth so readily associates itself with the illusion of immortality. I could imagine, for example, that the difference between short-term and long-term memory affords us two very different senses of temporal distance, which do not very easily coalesce. The contents of short-term memory have a certain poignant immediacy that long-term memory lacks. And the contents of long-term memory raise questions about when our memories, and by extension our lives, began--whereas the contents of short-term memory simply suppose our perpetual existence. Draw a line from the past, through the present, into the future from long-term memory, and you are left with a line extending from birth until death. Doing the same through short-term memory leaves you with your old, familiar self who peers a few days into the future: the self that always seems to be there, just hanging around, that never really changes, except when considered in the light of long-term memory.

I admit that the division drawn here between short-term and long-term memory may be arbitrary. I trust that everyone has had to 'step back' and consider their life in a longer perspective. Let the distinction stand for the sake of argument under that qualification. As one approaches the end of one's life, and the time remaining is counted in days or weeks, not years and decades, it would seem that the two temporal horizons begin to fuse. (Heidegger's rumblings about Dasein's Being-towards-Death have a certain purchase here.) I can imagine that one might just as easily panic at the prospect of running out of time.

My grandfather seems to have only contentment, the quiet assurance his Lord is taking him to a better place. I suppose one has to be there to truly understand. I suppose that means I am still young enough to wonder.

Monday, January 14, 2013

The Ontological Argument for the Existence of God

(This post is prompted by Chris Hallquist's 'The ontological argument in brief', posted on The Incredible Hallq, hosted on Patheos.)

The ontological argument for the existence of God is one of those arguments that truly perplexes post-Kantian thinkers. It does not attempt to prove the existence of God from any sort of observable phenomenon, i.e. any of the things of our mundane experience about which you or I might think, but from the act of thinking about the idea of God. In more common parlance, the claim of the ontological argument is that God's existence can be known by reason alone. To summarize St. Anselm's version of the ontological argument:
1) Since God is that being than which no greater being can be conceived, and
2) since existence in reality is greater than existence in the mind,
3) therefore, God must, by definition, exist.
For (the corollary argument goes),
4) if God exists only in the mind, then
5) it is always possible to conceive a being greater, namely, one that exists in reality,
3) therefore, God must, by definition, exist.
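Set out schematically--my own rough rendering of the reductio, not Anselm's wording--the corollary runs:

$$\begin{aligned}
&\text{(i) Let } G = \text{that than which nothing greater can be conceived.}\\
&\text{(ii) Suppose } G \text{ exists in the mind alone.}\\
&\text{(iii) Then a being conceived as existing in reality as well would be greater than } G.\\
&\text{(iv) But, by (i), nothing greater than } G \text{ can be conceived.}\\
&\text{(v) So (ii) fails, and } G \text{ must exist in reality as well as in the mind.}
\end{aligned}$$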
At the end of the 18th century, the Prussian philosopher Immanuel Kant ruled this sort of argument out of bounds by observing that to claim something exists adds nothing to the idea one has of that thing. The argument may be summarized with Kant's pithy, but opaque, statement, 'Existence is not a predicate'.

The specific example he used to illustrate the point was the Prussian equivalent of dollars (talers). To wit: The idea of $4 that I have in my head does not change if I add that these $4 actually exist. Regardless whether the money exists or not, the idea of $4 always stays the same. And, if the idea doesn't change when I add that the $4 also exists, the converse must also be true. Namely: that it is not possible to infer existence merely from an idea. Of course, the $4 might exist, and then again it might not, but there is no way to determine whether it exists from reason alone. The same consideration applies to unicorns, warp drive, my five beautiful wives (of whom only one actually exists), and the same goes for God.

What Kant's critique of the ontological argument underscored was that the existence of God--if he exists at all--cannot be the same sort of existence as that of the things of our mundane experience. Trees, tables, television shows, and the like, fall into this latter category. Kant did not seem to allow, however, that there could be another sort of existence, a divine existence, that stands apart from the things of our mundane experience. Most post-Kantian thinkers have followed Kant on this basic point, which should go a long way toward explaining why today belief in the existence of God is a de facto subjective belief, without any objective content beyond its personal significance for the believer. Indeed, the intellectual deck has been stacked against believers for the last couple of centuries, thanks to Kant.

Thursday, January 10, 2013

The Prospects of Online Education, con't.

A number of weeks ago I had a conversation with Dan Mullin about the prospects for online education over on Facebook, in response to an opinion piece I posted on this blog wondering about the same. We came to something of a disagreement over the importance of accreditation in a future job market. It was my contention that most of the tech journalists commenting on the spread of online education resources misunderstood the nature of traditional education. They saw it merely as a product to be consumed, and missed out on the important role colleges and universities played in accrediting students for potential future positions in the workforce.

This is important for how one values education, which, unlike food, clothes, and movie tickets, is one of life's very-difficult-to-value intangibles. A course on, say, the study of world religions, Shakespeare's sonnets, or the political transformation of England under Henry VIII is unlikely to do very well if it is just thrown into a marketplace of knowledge, where consumers pick and choose according to their personal interests. Dan has made the case that there are niche markets for this sort of thing, while I would argue niche markets do not provide a steady stream of income for higher education. A far better business model rolls all of these particular interests into a larger product: academic accreditation. Otherwise arcane topics need to be packaged along with other similar courses in a degree that comes with the value-added bonus of accrediting the student for a career in a field in which these arcane bits of knowledge are tangentially relevant. It was my thinking that if companies offering courses online were to survive, and make money, they would have to find some way to integrate themselves into the existing system of accreditation, on which the traditional system of higher education still has a monopoly.

It's possible that Dan and I have different ideas in mind when we use the word accreditation. I want to offer, for Dan's consideration, this article posted over on Slate, whose title takes the form of a question: 'Would you pay $100 for a free online college course?' The answer: maybe. Coursera is experimenting with the possibility of offering students the value-added bonus of a certificate for completing a course.

But the more immediate question is this: How many people will be willing to pay $100 for an online course that most others are taking for free?

Only a tiny minority, in all probability. After all, the certificate doesn’t count for college credit anywhere, at least at this point. And it doesn’t even necessarily mean that you didn’t cheat. All it does is make people marginally more likely to believe you when you tell them you got an A-minus in, say, Computational Methods for Data Analysis.

But don’t count out the project’s potential just yet.

This is still a long way from the socio-economic value of a universally recognized diploma or degree from an institution in the traditional education system. But it seems to indicate that companies like Coursera and Udacity recognize people do not generally consume knowledge merely for the sake of consuming knowledge. Other incentives are needed--like the possibility of social advancement.

This is one very important sense in which, contrary to Dan's contention that the traditional system of education no longer enjoys the confidence of the wider world, colleges and universities still have a leg up on their online competition.

Wednesday, January 09, 2013

The Mirror of Princes

Through the medieval period and into the Renaissance, the bookshelves of royalty, who may not themselves have been able to read, but certainly employed people who could read for their listening pleasure, were often populated by tomes which we now call 'the mirror of princes'. These were instruction manuals for how to rule well, which might take the form of a history, or possibly a more sophisticated theoretical treatise. Among their number are included the riveting History of the Franks by Gregory of Tours and the pedantic Education of a Christian Prince by the Renaissance scholar Erasmus. These days we love our diagrams and flowcharts, our 12-step programs and idiot's guides to everything under the sun. The medieval authors of the specula principum, however, learned a lesson from Plato. In order to 'see' an idea in action, a theory put into practice, you had to tell a story about that idea woven into a human narrative.

CNN is the first channel I turn to in the morning, and usually the last channel I turn to before going to bed. As an avid viewer, I have come to notice that America suffers from a dearth of good leaders. Most of this, I understand, is rhetoric, but even rhetoric contains traces of reality. When President Obama won't accommodate Congressional Republicans, for example, he is accused of failing to lead. Read between the lines. What you discover is that Congressional Republicans are actually complaining that Obama is refusing to see things their way. Lobbing the failure-to-lead barb across the aisle turns out to be a subtle way of saying you ought to follow us. But America doesn't seem to lack books, the intellectual successors to the specula principum, purporting to teach exactly what effective leadership is. Indeed, it is strange that America should suffer from a dearth of leadership at exactly the same time leadership programs across the educational spectrum are more popular than ever before. Everyone who is anyone is either enrolled or offering their own seminar.

Talk about leadership borders on the absurd. Leadership is what we need. It's also what we're missing, and where we should be investing our money. And what is this most rarefied of commodities? The contemporary discourse about leadership is fraught with dialectical implausibility. We say we need leaders. More likely we want you to pay for our leadership seminar--which we will be leading. We talk about leadership as if it were a tangible commodity. More likely we exercise it over you by naming it, in the process claiming it for ourselves. After all, unless you are the one talking about leadership, unless you are defining what it is, you are following someone else's lead. The logic of leadership is either/or. It has to be. Someone must take the initiative, or someone must have the final decision. This is why, incidentally, the republican John Locke was also a defender of royal supremacy.

The medieval authors of the specula principum might also be accused of engaging in the same sort of duplicitous talk. After all, kings did the commanding; learned men did the thinking and writing. This could be taken as an example of the tail wagging the dog. One lesson that the medieval authors might afford our contemporaries, however, is that the exercise of leadership is, in some sense, an exercise of moral authority over others. Being that, it cannot be talked about in the abstract, nor objectified, nor commodified, without missing the essential point about what it is to lead. That is also why every promise you hear from a contemporary author or 'leader' to reveal their secret formula for effective leadership actually boils down to a string of unrepeatable, personal anecdotes about what a person did in this or that situation that allowed them to rise to the challenge and overcome the odds. It has to do with interpersonal relations, which are ultimately not quantifiable, not measurable, and not constrained by formulaic description.

Instead of talking about effective leadership, which can border on cultic adoration of a personality (...cue reference to Hitler), we should be talking about the communal context of effective leadership. Instead of inflating certain persons beyond all human proportion, we ought to reflect on the humanity that binds us all together. The only way to escape the either/or logic of leading and following is to begin by insisting the relation between the two is not absolute. And that means tempering our language about leadership.

Tuesday, January 08, 2013

Utopia, a la John Maynard Keynes

There is much to admire in John Quiggin's article on the forgotten utopian ideals of John Maynard Keynes. An economy founded on a Greco-Roman principle of leisure, the tantalizing prospect of fifteen-hour work weeks, and the glaring light these shine on the inequities of ye olde trickle-downe effect can all be listed. Calls for dignified work at a decent wage have long since been drowned out by law-makers, bankers, and CEOs, mindful of boards of governors with their fingers pointed squarely at the bottom line. The political will to see these things accomplished seems to have been exhausted. The activities of trade unions are being curtailed through so-called 'right to work' legislation, the number of hours in a work-week has increased, and the mandatory age of retirement creeps ever higher. Governing officials respond by saying they are merely responding to the pressures of a changing marketplace. But that's the point; or, more precisely, that's the problem.
An escape from what Keynes called ‘the tunnel of economic necessity’ is still open to us. Yet it will require radical changes in the economic structures that drive the chase for money and in the attitudes shaped by a culture of consumption. After decades of finance-driven capitalism, it takes an effort to recall that such changes ever seemed possible.

Yet it is now clear that market liberalism has failed in its own terms. It promised that if markets were set free, everyone would benefit in the long run. The goal of maximizing profits while minimizing costs will not ultimately square with the ideal of a decent wage.
A more efficient marketplace is not the same thing as a better social order. Today we are guided by different stars, and it's naive to think we are still traveling towards the same place.

The glaring presumption that there ever has been, or ever will be, a traversable road to utopia ought not to be counted among the things to admire in the article. Whether we imagine that road to be paved by material progress, moral discipline, social planning, or technological development, the skeptical regard which Thomas More bore his own Utopia is a salutary reminder to denizens of the genre. Quiggin suggests there will be a point in the near future, given present material and technological progress, when no one will 'need' to be poor. The choice of terminology is telling. The suggestion seems to be that, should poverty still exist beyond that point, it will be because humanity has perversely stood in the way of what was otherwise inevitable. Here is a prime example of a hopeful monster if there ever was one. If it can be thought, the hope says, it can also be accomplished; but the monstrous means that would need to be employed to achieve the supposedly inevitable end tell a cautionary tale. Utopia is quite literally no place, or not a place, so no road will ever get you there.

Humanity is made from crooked, knotty stuff, which does not conform to abstractions so easily as the intellectually inclined might like to think. Not without good reason did Plato consign his Republic to the realm of ideas, only vaguely approximated through great effort in bodily reality. Nor was Augustine merely giving voice to an unjustified pessimism regarding human nature when he placed the New Jerusalem at the conclusion of the present age. Hopeful monsters, like the one Quiggin proposes, are rather the progeny of an intemperate impatience. It is a trap not easily avoided by modern liberals, who profess love for humanity in enlightened abstraction, but are severely disappointed by its colourful exemplars.