Monday, December 23, 2013

A Christmas Story: The Coronation of Charlemagne

On Christmas Day in the year 800, the Frankish king Charles the Great, better known as Charlemagne, was crowned Holy Roman Emperor.

Christmas stories are various and sundry. There are children’s stories like Rudolph the Red-Nosed Reindeer, the scriptural narratives of Jesus’ birth, or morality tales like Charles Dickens’ A Christmas Carol. The story of Charles’ coronation stands apart from the rest by being both factual and mainly a matter of academic discussion.

Unlike Rudolph or Scrooge, Charles is an actual historical figure. Unlike Jesus, the contested meaning of his earthly life does not inspire cultural crusaders to draw up battle lines. Though impressive in his own right, Charles cuts a rather mundane figure across the backdrop of human history. With so many other impressive figures from which to choose, he easily slips from view.

But this is not to say his story should not be taken down from the shelf and dusted off once in a while.

The details of his coronation are uncontroversial. Charles entered St. Peter's to celebrate Christmas mass. While he knelt before the altar, Pope Leo III placed a crown on his head and proclaimed him Holy Roman Emperor.

No one contests that a coronation took place on Christmas Day in the year 800. Owing to the distance in time and the scanty amount of evidence presently available, however, the meaning of the coronation is not entirely clear. Was the coronation the brainchild of the king or the pope? Which party stood to benefit? His biographer Einhard tells us that Charles was caught unawares when the pope named him emperor. Doubtless this was a well-crafted piece of political theatre. The humble king could be seen by all not to have sought this high office. The pope, as the exalted head of the Church, the Vicar of Christ, raised another up to oversee the mundane, worldly affairs of Christendom (its administration, defense, and the like) so that he could get back to the business of shepherding men's souls to their eternal home. The ends of both Church and State were served.

Now, since he already had a kingdom, Charles gained nothing but a title in his coronation. The oldest son of Pepin the Short, he was co-ruler of the Franks with his brother Carloman from 768-771. Just as war between the siblings seemed about to break out, Carloman died from what were apparently natural causes. The source materials give us no reason to suspect any misdeed on Charles' part. Nor would we expect them to. It is almost inconceivable that any medieval historian would begin the narrative of the reign of so successful a king as Charles with betrayal and murder. Lacking evidence, we may only gesture into a speculative void about what was actually the case.

But it is fairly easy to infer from the events of his life that Charles wanted an empire--and by implication the title that went along with it. Much of his life was spent in the saddle. He saw action in present-day France, Spain, Italy, and Germany, as well as on the islands of Corsica and Sardinia. Everywhere the borders of his realm were extended, northwards into Saxony, southward beyond Barcelona, and eastward towards the Danube. To his credit, Charles was also responsible for a series of economic, monetary, educational, and ecclesiastical reforms. These efforts were collectively responsible for the brief flowering of Frankish culture known as the Carolingian Renaissance. They could not have been effected without the support Charlemagne lent to the Church and to its institutional reform.

Ties between the Frankish court and the papal curia grew apace. The Church was a source of both clerical (i.e. religious) and clerkly (i.e. administrative) support. So when Pope Leo fled Rome in 799, he went to Charles for help regaining the papal throne. Charles made his way to Rome in November of 800 at the head of an army. And a little more than a month later, at Christmas mass in St. Peter's, Leo crowned him Imperator Romanorum, 'Emperor of the Romans'. The moment had a double significance: it restored to the West the imperial authority that had departed three centuries earlier to Constantinople in the East; and it formally severed ties that had already been severed in practice with the Roman (read: Byzantine) Empress Irene in Constantinople. The material resources of the Eastern Roman Empire had long since become insufficient to maintain the temporal holdings of the Church in Rome proper against the depredations of the Lombard lords. The Western Church needed a Western champion.

Einhard claims Charles had no knowledge of the pope’s intention, and would not have agreed if he had: ‘[H]e at first had such an aversion that he declared that he would not have set foot in the Church the day that they [the imperial titles] were conferred, although it was a great feast-day, if he could have foreseen the design of the Pope.’

As we all know, every Christmas story ought to have a moral. That being the case, we must search for a moral to the story of Charles' imperial coronation.

Perhaps we might see in Charles' coronation a caution against confusing the jurisdictions of Church and State. The pairing of Charles and his biographer Einhard is strikingly similar to that of Constantine and his biographer Eusebius. The latter pair has come under considerable criticism in recent centuries for drawing politics and religion closer together than our modern liberal sensibilities are comfortable with. We might also draw a salutary lesson about the grand pretensions of political theatre. The vociferous claims Charles made about not having wanted an imperial mantle do not pass the smell test. The formal imperial inauguration only confirmed in theory what had already come to pass in practice.

These assessments contain a measure of truth; but they both go against the grain of human history. Time always moves forward, even as we look back and assess from where exactly we have come. The past was no different. Whatever judgment we pass against Charles must be made with this in mind.

So we ought to judge Charles against his predecessors, just as we must judge ourselves against our predecessors—one of whom was Charles.

The appropriate comparison is made with the pharaohs of Egypt, the kings of Babylon, and the emperors of Rome. Set alongside their imperial rhetoric, Charles' denial of having sought an imperial mantle is startling. It matters very little, in this light, whether Charles' display of humility was only pretense. What matters is that pretense was necessary in the first place. Over the course of a millennium, the basic forms of political legitimation had been entirely inverted. The rulers of the ancient empires claimed to be gods or sons of gods. At least in theory, even if theory was not completely realized in practice, they were the all-powerful manifestations of divinity on earth. They were the soul animating the body politic; the lives of men were theirs to dispense with as they saw fit. The degree of divinity to which a ruler in the ancient world could lay claim, in fact, appears to be a function of the density, size, and sophistication of the civilization. The larger the political entity, the more precarious its existence; which meant that the measure of authority required to maintain a political entity increased exponentially, until it became only natural for kings to liken themselves to gods.

A clear division between Church and State is the first indication that something fundamental had changed in the transposition from the Ancient into the Medieval worlds. The authority of the gods over men is no longer concentrated in a single person; it is differentiated into separate forms, which serve to limit each other. The key to the transposition is in the pretended humility. The basic form of authority in the Ancient world was that of the gods over men; the basic form of authority in the Medieval world was that of God become man, and more precisely, a vulnerable infant. The actual exercise of authority may still entail the command of one man over the rest, but, to borrow a phrase from Yeats, it now slouches towards Bethlehem.

Sunday, August 25, 2013

Funding the Humanities

I won't bother claiming to be a dispassionate or disinterested proponent of the humanities. A program of religious studies, like the one in which I am enrolled, is about as humanistic as it gets. Not everyone would agree with the characterization, of course, but the inference is a sound one. Faculties of religious studies got their start as programs in the secular or scientific study of religion. They were supposed to be non-confessional; and so were much less concerned with the nature of divinity than they were in what this or that belief in divinity said about the human beings who held it. Focus was on the one thing about human beings that is difficult to explain away on other terms: why we believe what we think to be the case about X (where X can be anything under or over the sun), rather than merely what we think about X.

So my numbness to the fact we in our collective wisdom have decided the humanities simply aren't valuable in the broad scheme of things thaws a little when the economist Christina Paxson offers 'The Economic Case for Saving the Humanities' over at the New Republic. The piece is an effort to turn the tables on the standard arguments against funding the humanities. If we in our collective wisdom deemed the humanities valuable, some of the monies pouring into faculties of science and medicine would be reassigned to history, philosophy, and the fine arts. The federal government would apportion a lot more money to research grants in African literature or Asian antiquities. And employers would eagerly hire persons demonstrating a capacity to learn, critically analyze, and achieve research and/or other goals.

But money isn't pouring in. Paxson points out that the rationale for governments to invest in the so-called S.T.E.M. subjects--science, technology, engineering, and mathematics--is a simple matter of connecting the dots. The payout is calculably predictable, much like the sort of stuff dealt with in the subjects themselves. The same cannot be said of the humanities. Figuring out the monetary value of a study of the relationship between the two parts of Miguel de Cervantes' Don Quixote, for example, is like tilting at windmills. A post-structuralist reading of Xenophon's portrait of Socrates suffers from similar pecuniary under-determination. These cannot be quantified in the same way the matter of the S.T.E.M. subjects can be quantified. The consequence is that public servants, who must give an account of their funding decisions to their respective political constituencies, err on the side of caution and control for those variables which can be measured. And for the time being, the humanities live off a dwindling institutional memory of better days.

So we need to learn how to argue, Paxson says, 'there are real, tangible benefits to the humanistic disciplines—to the study of history, literature, art, theater, music, and languages.' No doubt she is right. We do need to learn to argue for the tangible benefits of humanistic study. Obviously we, especially those of us in the humanities, have forgotten how to make such an argument.

The 'economic' character of Paxson's arguments is problematic. Their weakness may be seen in how they haphazardly circle around the point. Here's a sampling:
'[I]t is evident that many of the men and women who were exposed to that curriculum went on to positions of genuine leadership in the public and private sectors.'
'[W]e do not always know the future benefits of what we study and therefore should not rush to reject some forms of research as less deserving than others.'
'We should be prepared to accept that the value of certain studies may be difficult to measure and may not be clear for decades or even centuries.'
The first argument appeals to anecdotal evidence, to contingent circumstances, not necessary conditions. The second and third arguments bring in epistemic considerations about the inability of our metrics to predict the shape of the future. Most notably, these aren't peculiarly economic arguments. All three appeal to a rough and ready practicality. Well aware of the reasons offered for why the humanities ought not to be funded, Paxson skirts around the question of why we ought to fund them.

Let me take a stab at answering the question. The strongest argument to be made for increasing funding to the humanities is that they, like so many of the other things we value in our lives, have no obvious, measurable, practical purpose. As paradoxical as this may seem, it gets at something essential to being human. The immediate payout from reading a good novel is almost non-existent. More likely, you spent money in order to purchase the novel. The same goes for conversations in coffee shops, reading the newspaper, or watching the news. The list goes on. We do these things because we want to, because, for whatever reason, we enjoy doing them, not because doing so has an obvious dollar value attached to it.

The idea that an entire human life ought to be subject to market discipline revolts even the most hard-nosed of capitalists. (Hence they spend extravagantly on the so-called superfluous aspects of their own lives.) For that reason, and that reason alone, the humanities need a humanistic defense grounded in what it means to be human, not an economic one pegged to balance sheets and bottom lines. The proof is near and dear to every single one of us. The latter concerns cannot be ignored, of course, but they have their particular place in a well-lived human life, rather than the other way around.

Where do you look for the basic inspiration behind such a reordering of priorities? Usually in religious texts, among other places. The first chapter of the Book of Genesis describes the creation of the world, and the creation of human beings in God's image. No reason is offered for why God created the world. The only thing the reader can make out is that God did. The consequence is that human life, existence itself, is best understood as the product of a supremely pointless divine act. Not to despair, though. Things don't end badly for the human race. The text of Genesis finds a reflection of God's supremely pointless act in the human being, a creature created in the image of its Creator.

The creation of humanity in God's image is one of those catch-phrases, like others insisting every human being possesses an intrinsic dignity and is invested with certain rights merely by virtue of being human, which illuminate the rest of the world. We reason from them towards some conclusion, not towards them from other premises. Like so much of human life that cannot be rationalized on the strict terms of the hard sciences, things are because they are--or, more precisely, because we want them to be.

The image of the humanities as a beleaguered bastion of light holding out against an assault of bankers and bean-counters won't pass a smell test. The problems facing studies in the humanities are much bigger than mere institutional arrangements or the immediate problems of funding allocation. Fiddling while the humanities slowly burn to the ground is something we have collectively determined to do, including persons claiming to work in the humanities. Stanley Fish comes immediately to mind. The malaise of a modern education is subtle and pervasive; it goes much deeper than individual figures, deep down into our basic assumptions about the way things are.

The demise of the humanities follows upon our collective failure to see human life as anything more than an individual can make of it. We live in communities, of course, but we have forgotten how to think about life as if it is lived in the community of others. So we fiddle while Rome burns, and pretend not to understand that those things each of us individually desires for ourselves--a roof over our heads, clothes on our backs, food on the table, the company of family and friends, and a modicum of freedom to explore this short life's possibilities--aren't also collectively desirable.

In the end, the demise of the humanities isn't merely about a small number of academic disciplines. 

Friday, August 23, 2013

The Interfaith Identity Crisis

About a week ago, the Washington Post argued the nature of interfaith endeavours has shifted with demographics. A more diverse population means interaction between religious groups is no longer restricted to the clergy. In fact, a typical practitioner can now be expected to have some sort of contact with persons of different faiths.

Children who grow up and go to college or university today have very different experiences than their parents. Interfaith used to be something people did. Now it is something people live daily. Though there now exist twice as many interfaith groups in the United States as a decade ago, making the generational transition has been difficult for many. Old assumptions are being challenged, and questions of new priorities must be raised.

In a Huffington Post article, Rev. Donald Heckman, Executive Director of Religions for Peace USA, suggests the interfaith movement must rebrand itself. The term means too many things to too many people to convey anything definite to the wider public. In response to a growing number of persons who do not identify with any particular religious tradition, he says,
'I think we may need to cede the term "interfaith" to the small but growing number of people who see faith, religion and spirituality as boundary-less enterprises of exploration and who allow for multiple affiliations. And the more narrow technical term "interreligious" needs to be co-opted to cover the broad arc of things that are multi-, inter- and intra- for -faith, -religious and - spiritual.'
But is the problem really just about branding? If it's about religion, doesn't it go a whole lot deeper than the question of what a person calls themselves?

Heckman is asking the right questions. The way he is asking them, however, leaves something to be desired. The deepest motivation of the interfaith movement has always been to bring people together. And that makes the wisdom of more carefully parsing the names we apply to ourselves doubtful.

The problems the interfaith movement presently faces are perennial problems, which have taken on new forms in a new context. Seen in that light, answers to questions about how to move forward should become more obvious.

The basic problem has always been how one engages persons of other faiths while remaining true to one's own faith. How can I both be a Buddhist, Christian, Hindu, Jew, Muslim, etc. and engage constructively with persons of other faiths?

There seems to be an assumption, especially in certain Evangelical Christian communities, that the logic of religious identity is ironclad: one can be either this or that, but not both. And the only reason to talk to members of other faiths is to convert them.

Rather than rebranding, the interfaith movement should be retooling. Since more and more people are living the interfaith movement on a daily basis, what is needed more than ever is to equip and teach people to find inspiration for interfaith engagement within their particular religious traditions.

I don't mean glossy presentations of the things religions share in common, though that must be a part of it. I mean encouraging Christians to think on what it means to see everyone as being created in the image of God, Muslims what it means to be Allah's representatives on earth, Hindus what it means to be jivas, and so on.

Our religious traditions, without exception, classically wrestled with the dignity and misery of being human. They set out to achieve the impossible goal of reconciling the entire human race to each other. They also cautioned against presuming too much about one's own abilities to accomplish that goal. The labels we gave ourselves, in this picture, matter a whole lot less than actual flesh and blood.

The interfaith movement needs to see itself not as a solution to a problem everyone else has. If that were the case, then rebranding would be all that's needed. The interfaith movement needs rather to see itself as taking part in the very thing people have been working at for many millennia. Only then will it catch up to the truth that people are living interfaith lives every single day.

Tuesday, August 13, 2013

A Review of Arvind Sharma's Gandhi: A Spiritual Biography

Here is a question worth pondering. Has a biographer really done his subject justice when God appears in a life’s story as an actual actor, and not just as a literary device, inspirational thought, or private conceit?  At stake in the question’s answer is truth. Not THE TRUTH, mind you. Not what truth is; but much more importantly how truth is told.  Has a biographer told the truth of his subject if the divine majesty is allowed to skulk between every line of every page?

The truth is, or ought to be, it seems, much more mundane.  In truth’s unvarnished form, readers confront the cold, hard stuff of the real world. Right?

The question’s answer cannot be so simple, however, when a biographer sets out to write a spiritual biography.  The Yale University Press has just published Gandhi: A Spiritual Biography (2013) by Arvind Sharma of McGill University in Montreal, Quebec. With the opening lines, Sharma warns, ‘History is more than the biography of those who make it’, and immediately counters, ‘Nevertheless, some people leave their mark on history in such an elusive way that historiography perpetually fails to capture it.’

Gandhi was such a person, Sharma suggests, along with Moses, Jesus, and the Buddha, and a small number of others. Most biographies on Gandhi are written about Mohandas Gandhi. They refer to Mohandas with the honorific Mahatma, or ‘Great Soul’, but are concerned with events and people, politics and social processes. A spiritual biography of the man takes Mahatma Gandhi as its subject, and looks at what it means to be a mahatma.

Sharma’s credentials certainly qualify him to write such a book. The Birks Professor of Comparative Religion at McGill’s Faculty of Religious Studies, Sharma uses his specialization in Hinduism as a bridge to much more general topics, including religion and feminism and religion and human rights. He is the author of One Religion Too Many: The Religiously Comparative Reflections of a Comparatively Religious Hindu (2011). The book is Sharma’s spiritual autobiography, chock-full of wry observations about growing up a Hindu and encountering other religious traditions along life’s way. After the Gandhian fashion of marrying faith to social activism, Sharma has also convened two international conferences looking at religion and human rights: World’s Religion after September 11 in 2006 and the Second Global Conference on World’s Religion after September 11 in 2011. A third and final conference is now in the works for the second half of 2016.

Every one of Gandhi’s biographers must confront the question about the source of his power to inspire. The ends of spiritual biography, Sharma’s argument runs, are much more appropriate to Gandhi’s fundamental motivations than are other sorts of biography. It goes to the heart of the matter, so to speak, to the place where word intersects with deed. ‘Gandhi’s claim was made upon our conscience; he demonstrated that spirituality is to be found at the core of our humanity.’

Sharma’s discussion is lively. At points, even if a little dialectical and didactic, the prose dances off the page into the reader’s imagination. Spiritual biographers risk falling into hagiography, but Sharma demythologizes Gandhi in order to preserve his saintliness. Gandhi demythologized himself, Sharma points out, by attributing his larger-than-life accomplishments to God. If he was a saint, his saintliness was in part due to his willingness to own the flaws of his character. Sharma examines a number of them in the course of the book.

Which God did Gandhi serve precisely? Good Aristotelians the lot of us, we may argue over the specific nature and attributes of the divine majesty—or whether it makes sense to speak of God existing or as existent. Whether, in our intensely analytic moments, we master our language or it masters us remains to be seen. We also stand to miss the point; that, at least, was what I took away from Sharma’s book. Gandhi died with three bullets in his chest and the name Rama on his lips. He identified Rama with Truth, wherever it may be found, but especially through introspection and selfless service.

God as Rama as Truth could never be a mere propositional statement. The reality of God must be lived in order to be known. The insistence on identifying word and deed, Sharma points out, led Gandhi to his death. He was assassinated because he insisted India fulfill its promise of a third payment to Pakistan; India had given its word. The fact the two countries were then at war could not change his mind. Gandhi took it upon himself to see the promise fulfilled; the name Rama on his lips, his final gesture was one of forgiveness to his executioner.

Gandhi: A Spiritual Biography divides neatly in half. The first half treats significant episodes in Gandhi’s life. The second looks at significant themes in his thought. The book does not propose to be an exhaustive study, though it most certainly qualifies as an illuminating and instructive one. The author may be forgiven, therefore, if readers find themselves wondering how Gandhi got from a point A to a point B, or what motivated him to make the move. The scarcity of this sort of information is easily compensated by the depth of Sharma’s treatment of Gandhi’s psyche: his thoughts on sex and celibacy, British imperialism, his own spiritual heritage, and the caste system are just a few of the topics he covers.

The book draws me to one conclusion: other modes of biographical writing aside, a spiritual biography on the life of Mahatma Gandhi cannot fail to testify to God. Absent the divine majesty, Gandhi’s intentions had no purpose, his actions no end, his thoughts no object. Absent God there could be no Mahatma.

Monday, July 29, 2013

Reza Aslan's Zealot: The Life and Times of Jesus of Nazareth (Updated)

Let's take a break from the blog series on Isaiah and talk about Reza Aslan's Zealot: The Life and Times of Jesus of Nazareth (2013). Since an interview on Fox News with a host who was not able to get past the idea of a Muslim writing on Christianity, Aslan's book has sold briskly on Amazon. Not that it was doing poorly before; only now it is at the top of the charts.

On my shelf is sitting his No god but God: The Origins, Evolution, and Future of Islam (2006), from which I developed a healthy respect for Aslan's acumen. It seems appropriate that a Muslim should write a book on Jesus, since Islam claims Christ as a prophet who brings the Gospel, like Moses brought the Torah, David the Psalms, and Muhammad the Qur'an, affirms his virgin birth, and proclaims his return on the Last Day. Only the orthodox Christian formulation about the two natures, divine and human, in one person is absent from the Islamic account. It betrays the ignorance of the Fox News host, and everyone else who thinks what Aslan wrote is fundamentally objectionable, to suggest Aslan has absolutely no business writing on Jesus. Unlike persons, religions are not discrete entities; they overlap, interweave, and mix in the heads of persons down through the course of human history.

But Aslan has taken a lot of heat from certain quarters for his newest book. Understandably, though regrettably, conservative Christian quarters in the main. The most intelligent criticism I have read so far comes from First Things blogger Matthew Franck, who points out 'Reza Aslan Misrepresents His Scholarly Credentials'. I say 'intelligent' because the article is more than mere opinion. The author did a little bit of digging around to develop the piece. But the argument may not be entirely fair. Franck places more value on form rather than content, on the external things, which should only be regarded as of secondary importance. The implied suggestion is that, because Aslan misrepresents his scholarly credentials, we should doubt his contribution to the broader conversation. But a scholar who spends his life reading texts about religious beliefs and writing books on religious topics ought to qualify as a scholar of religion, in my estimation, regardless what his current academic title is or what his dissertation is on. Franck disagrees. You can read his piece for yourself and form your own opinion.

The obvious point to be made, apparent in the title of the book, is that Aslan's Jesus is not the Jesus of the New Testament Gospels. The Gospels are fairly careful to distinguish the sort of messiah Jesus was supposed to be from other Jewish claimants to messiahship around the same time. Jesus' kingdom is not of this world. The kingdom of God is within you. Render unto Caesar what is Caesar's, and unto God what is God's. And so on and so forth. Short little catchphrases may be found throughout the Gospels, all of which relativize the importance of transient worldly success. (It is transient, after all.) The message of the Gospels is subversive in a bend-over-and-take-it-on-the-backside kind of way. Jesus ends up going to die on the cross--willingly.

The inch-deep, mile-wide cultural commentary out there misses the fact that Aslan's basic hermeneutic for reading the Gospels does not come from Islam, but from 19th century European seminaries. Very intelligent theologians, for reasons peculiar to the place and time, decided the Gospels' portraits of Jesus were not historically reliable. They drew a fundamental distinction between the Christ of faith and the Jesus of history. Believers could believe whatever they wanted. On the other hand, scholars had to restrain themselves from saying anything beyond the surface of human history. This scholarly attitude lives on in such organizations as The Jesus Seminar.

Aslan's Jesus is a rebel of sorts seeking to effect some worldly change. So Aslan rejects the final implications of the Gospel portrait. As he says in the opening pages of his book, 'If we expose the claims of the gospels to the heat of historical analysis, we can purge the scriptures of their literary and theological flourishes and forge a far more accurate picture of the Jesus of history.' Divide faith from history, like Aslan does, and the willingness of the historical Jesus to go to death in order to effect a victory, not over mere human powers, but over sin, death, and the denizens of hell, no longer makes much sense.

There is nothing new in Aslan's arguments, and certainly nothing worth losing our heads over--nor compromising our resolve to love our neighbours as ourselves, even and especially when they may disagree with us. There is nothing especially offensive in his presentation either. It is entirely in line with a Christian confession of belief that Jesus Christ was the Son of God that a non-believer doesn't believe the same. Whether Aslan has mined the Gospels for all the viable 'historical material' that can be had from them--well, that is another question.

Sunday, June 30, 2013

The External World

Among the perennial questions of Western philosophical tradition is one about the existence of the 'external' world. In its most basic form, the question asks, Do things exist apart from our thoughts about things? Is it true, in other words, that to be is also to be perceived?--to borrow a phrase from the 18th century Anglo-Irish philosopher Bishop George Berkeley. Does the tree exist because you think about it, or does it exist prior to your thinking about it?

But the answer to the question is not as obvious as it first appears. The longer you think about the question, in fact, the more obscure it becomes.

We have good reasons for thinking things exist apart from our thoughts about them. To begin with, we fall asleep at night and wake up to find the world much the same as we left it. We travel familiar routes to home, school, or work, navigating by means of familiar landmarks. The continued presence of objects in our physical environment provides a very strong reason to think they exist apart from our conscious perception of them.

As far as a naive faith in the external world goes, philosophers seem the worst of the bunch. They can always be found talking about Kant's view of this or Heidegger's view of that, as if Kant and Heidegger, and their views of this or that, were out there waiting to be looked at, thought about, and discussed at great length. Which, of course, they are--recorded for us in books.

We are very comfortable with the thought that an external world exists apart from our thought about that world. It helps us make sense of learning, discovery, and being in error. Something 'external' to our thinking provides a standard against which to measure the truth of our thought. Our thought runs up against it, tries to comprehend it, arrives at a provisional understanding, makes a decision as to its adequacy, and so on. We presume the existence of an external world whenever we communicate our thoughts to others. At least, those of us do who have not yet figured out how to communicate directly, one mind to another. Not only do we make use of the external world as a medium of communication, much of our communication has to do with calling others' attention to some object found there.

Not everyone is happy with the language of an external world, nor with the implied idea that the world is one thing and thought about the world another. (The aforementioned Kant is a good example.) The philosopher Daniel Dennett has coined the term 'Cartesian theatre' to capture how strange the idea of thinking about the world as external to ourselves really is. The most obvious reason the idea doesn't measure up, of course, is that we find ourselves in the external world: our bodies. We are, in some very real sense, our bodies. As our bodies move, so we move. As our bodies grow, so we grow. Where our eyes look, our conscious attention seems to follow--or does it lead? Dennett enjoys mocking persons who think of themselves as looking at themselves (their body) from an undefined location (their mind). The mind is not the brain, after all. The brain is something that can be seen, picked at, pulled apart, sliced into sections. The same cannot be done to the mind, per the definition of mind. But if it can't be observed and studied, it seems legitimate to wonder whether the thing exists at all.

I haven't much time for Dennett's endless refusal to say anything positive about this thing I call myself, though I find his line of questioning a helpful foil. Thomas Nagel has it exactly right when he says that Dennett merely redefines consciousness as an external property, ignoring the essential problem: the subjective first-person perspective that each of us occupies, and that no one else can occupy for us. Indeed, it's the individual's first-person perspective (which, when reified, is called an immaterial soul, the life of the rational animal) that makes the external world a problem in the first place.

The individual first-person perspective throws a monkey wrench into any abstract formulation--whether it's Berkeley's to be is to be perceived or Dennett's critique of the 'Cartesian theatre'. Certainly the logic of these positions can be tried and tested; but logical analysis aims at universal applicability, which is precisely not a first-person perspective. If the world exists only because I perceive it, the rest of you have a real problem. Likewise, if a first-person perspective is nothing, or at least nothing worth thinking about, then we, each individually, all have a real problem.

Bishop Berkeley had an answer. To be can still be to be perceived, even if no human being is perceiving every single object in the external world all the time, Berkeley claimed, because the being we call God perceives everything, which allows things also to exist apart from partial human perspectives. That is not a solution open to Dennett, or at least not one he thinks is open to him. So he runs away from the first-person perspective; and, we might say, trips over the elephant in the room--himself.

The idea of a world external to ourselves, it seems to me, helps us all make sense of our individuality. It allows me to say your perspective on things may differ from my perspective on things by creating a buffer zone between the part of me only I have access to and the part of me the rest of the world gets to see. You are external to me. We can talk things out, but we won't necessarily come to an agreement, or even an understanding. And that is okay.

Saturday, June 29, 2013

Richard Fletcher, Historian

Richard Fletcher was a rarity among historians. A medievalist, Fletcher published books on Anglo-Saxon England and on Moorish and Christian Spain, arguing that the Reconquista actually began in the 11th century rather than the 8th, to which it is usually dated. Another of his impressive scholarly accomplishments was The Barbarian Conversion (1999), which looked at Christian missions into the dark heart of Europe between the fall of the Western Roman Empire and the Reformation, with an eye to happenings in the Eastern Roman Empire, the Middle East, and North Africa. It is not hard to imagine that Fletcher saw himself picking up where Edward Gibbon left off, only with a much less jaundiced eye towards events and persons who didn't obviously exude the material greatness and organizational power of the Antonine Dynasty.

It is my experience that history books can all be arranged along a single axis stretching from a purely objective perspective on the historical subject matter to an investigative perspective that gives readers a glimpse of the difficulties historians encounter trying to interpret their sources. Most historians fit into the former category. They may talk a good talk about the multiplicity of perspectives from which the sources can be studied; but they rarely reflect on the limitations imposed on historians by the limited availability of materials. History textbooks assigned in undergraduate classes, as well as most survey texts, fit into this category. They tell what happened when, and why things happened the way they did. Narrative threads are woven together presenting 'the present state of the field of study'. Specialized historical studies also follow this general pattern. In their introductory chapter, the historian usually tells you what other historians have written, what new evidence they have found, and how it confirms what we have already discovered or how it should radically change how the field of study is understood.

Fletcher's Bloodfeud: Murder and Revenge in Anglo-Saxon England (2002) is one of those rare histories that lets you follow a historian reading texts, trying to discern where all the pieces fall. The roughly five-century stretch of time from the establishment of Anglo-Saxon rule in 577 until the Norman Conquest in 1066 comprises England's participation in the Dark Ages. The earlier in the period one finds oneself, the more scarce the evidence becomes. Even in the last leg of the period, from the Danish Conquest in 1016 until its conclusion, much is left to be desired.

The northernmost English province, Northumbria, was ruled by Earl Uthred, celebrated with the title 'the Bold'. In 1016, Uthred came to pay his respects at the court of the Danish king Canute (or Cnut) at a place called Wiheal. The location of the meeting, Fletcher indicates, is part of the mystery. We don't presently know where it is. Uthred and forty of his clients and retainers died that day. His death set into motion 'a bloodfeud that lasted for three generations and almost sixty years'.

Bloodfeud patiently sifts through what evidence remains in an effort to discern the motivations of the different persons involved. Sometimes all we have to go on are single sentences carelessly dropped into The Anglo-Saxon Chronicle, a document contemporary to the period in question. Often we draw imaginative inferences from what we know generally about life at the time, from studies of comparable materials drawn from elsewhere, from the commonly accepted rules under which bloodfeuds were prosecuted, and so on.

Fletcher's gift was to convey the difficult constraints any historian, especially one who works at such a great distance in time, must work under. The gift is rare. The problem I want to think through is why the gift is rare.

A few reasons come immediately to mind. The first is that most people are not trained as historians. Those who do pay some small amount of attention to the human past by reading survey texts or specialized studies are more likely to assimilate the historian's conclusions than the historian's experience of coming to those conclusions. This occurred there and then; or this happened because that happened; but not our lack of certainty on this or that point. The second is that the immense amount of material published on any one place and time in human history is likely to shore up erroneous assumptions about just how much evidence is available. Readers don't necessarily contemplate the fact that single lines in an ancient text can generate an exponentially growing amount of commentary, none of which can get around the simple lack of additional evidence needed to corroborate this or that interpretation. The third is that a majority of people, if they are interested in the past at all, are more likely to be interested in the recent past. And it is precisely in the recent past, especially the very recent past, that we encounter a glut of material evidence.

Put together, these give rise to what I will call an 'empirical fetishism'. For every question, there should in principle be an answer. If there isn't an answer, we allow ourselves to hypothesize about a 'best fit' answer. Empirical fetishism means that our knowledge of the world ought to be a seamless whole. We don't like holes in our seamless whole, so we fill them. Fletcher points out that the village of Wighill has been suggested as the location of Earl Uthred's murder at Wiheal, along with a number of other candidates whose names begin with W. Wikipedia names Wighill as the location of his murder, in fact, but without any comment on the interpretive dilemmas of identifying this particular place with that particular name in that hoary tome. History, it seems, like nature, abhors a vacuum.

Let's not make fun of Wikipedia on this point. They are only doing what most everybody else does in their situation: drawing conclusions, filling in blanks. Because it is impossible to construct a consistent account of the whole body of our knowledge about the world and its past, empirical fetishism itself gives rise to perspectivism. Everyone has their own perspective on things. You can think about things in as many ways as you want, of course. The interpretation of the human past, even the immediate past, but especially the distant past, however, often leaves a person with nothing to have a perspective on. That sort of empirical sensitivity is why we need more historians like Richard Fletcher, as it's very easy to assume a perspective on things can replace due attention to the things themselves.

Thursday, June 27, 2013

Religion and Canadian Secularity

The very few people who find their way to this page (like my father; or Tyler, who may find the later part of the article interesting because it reflects two very different legal cultures) may be interested to read an essay published by an old classmate and housemate of mine: 'Bringing Religion into Foreign Policy' by Robert Joustra.

The essay is a compressed version of Rob's thesis on the public debate around the establishment of an Office of Religious Freedom in Canada's Department of Foreign Affairs. The Canadian discussion has been divided between two rival versions of secularism: Laïcité and Judeo-Christian Secularism. The former sees secularism as a rejection of a role for religion in politics, while the latter sees a secular political sphere as the special creation of a Judeo-Christian outlook on life. What happens when an office of religious freedom is charged with monitoring religious freedom around the world as part of Canada's foreign policy? Obviously the two secularisms come to blows. Laïcité secularists get uncomfortable about an overlap they think shouldn't exist. Judeo-Christian secularists see opportunities to promote their own Judeo-Christian outlook on life. The story is naturally a little more complicated than I have just described. Rob does an excellent job detailing the holy mess. He concludes with a quote from Jacques Maritain discussing how the committee drafting the Universal Declaration of Human Rights was able to achieve consensus despite its members being of very different intellectual persuasions. If any of this interests you, the article is well worth a read.

My quibbles with the article are a little more esoteric. The meaning of terms like religion and secular has changed over the centuries, as Rob rightly points out at the beginning of the article, and that's where things begin to fall apart. It seems that Rob has adopted the problematic phenomenological language of 'neutral' description of 'historical entities'.

Here are a few examples:
  • Scholars need to do 'a better job of making sense of a thing called religion'.
  • 'These efforts to engage with religion are motivated by the misguided belief that the inclusion of religion encourages a more peaceable global order.'
  • 'But it does not necessarily follow that there is no historical thing as religion and its freedoms.'
  • 'Only once we clarify these meanings can we decide if we are prepared to truly acknowledge religion’s contested nature in the structure and aims of our foreign policy.'
A few points to note: religion is a thing, religion has the capacity to encourage, religion has a contested nature. Turn these phrases over in your mind a few more times. The more you think about them, the stranger they get. Religion is a 'thing', but I bet no one has ever seen it. Religion encourages peace, which seems to imply that it has a capacity to act as a human being acts. Religion has a contested nature--but then again, so do most invisible entities that you can't see or touch, but that encourage you to love your neighbour and seek the welfare of your fellow human beings.

While Rob does a really good job dissecting different versions of secularism, he doesn't really clarify what he means by religion. If I were in an uncharitable mood, I might venture to suggest that the initial talk of religion being an essentially contested concept is just a smoke screen to cover over...something. I am not sure what.

Rob himself uses the term religion in two distinct and readily discernible ways. The first is as a way of talking about communities. When politicians want to engage with religion, that usually means engaging with religious communities through their clerical leadership. Whenever we talk about religions in history, more generally, we aren't actually talking about something called religion, but about a profoundly powerful way of organizing communities. The second is the way people think about themselves in relation to other things, other people, and ultimately to the world at large.

Instead of telling his readers how he uses the term religion, why does Rob hide behind postmodern rhetoric about the fluidity of meaning? I mean: anything is better than talking about a spooky something that you can use to make people do things, which is the upshot of calling religion a thing, trying to figure out its uses, what it does, how it works, and so forth.

My suggestion is that Rob is actually a Laïcité secularist. My evidence? Talk about humanity is incredibly sparse. In Laïcité secularism, religion is something added onto human nature in the course of human history, and it can be excised from human nature via the application, for example, of scientific thinking. So the Laïcité secularist talks about religion as if it were an object out there that human beings can talk about, but which has no necessary relation to the human being.

And that's exactly what Rob does. As I pointed out above, it's obvious he is talking about religion either as a form of community defined by clerical leadership or as a way persons think about themselves in relation to the wider world. But he doesn't draw those conclusions. My best guess why he does not do so is that he thinks about religion as something extraneous to the human being. If he had thought about religion in relation to humanity, he wouldn't be hiding behind the smoke screen of an 'essentially contested concept'--which, I note, is exactly what it seems to be: an idea in a person's head, a way of thinking about things, and maybe even a way of thinking about ourselves in relation to the rest of the world.

My conclusion: Rob is practically an atheist. Emphasis on the practically. Cheers, Rob.

Wednesday, June 26, 2013

Catholic Ecumenism

I was reminded of how difficult it is to determine what motivates people to do the things they do while reading an article in the May/June issue of Foreign Affairs on 'The Church Undivided: Benedict's Quest to Bring Christians Back Together'. The author Victor Gaetan does a very fair job describing the Catholic Church's idealism, its desire to be reunited with old friends and foes alike. He describes Pope Benedict's deliberate steps towards reconciliation with the Lutheran, Anglican, and Eastern Orthodox churches. The perception of a papal misstep in the now infamous Regensburg Address, in which the pope appeared to disparage the Muslim faith, is ably shown to be faulty.

In Gaetan's description of Benedict's papacy, something of the spirit of the Renaissance Cardinal Nicholas of Cusa's idea comes through: there is only ever one religion for all human beings, even if the rituals are various and sundry. A concluding comment on the tenor of Francis' papacy sees more of the same.

What (for lack of a better term) caused or 'gave rise' to these new efforts towards ecclesiastical reconciliation--and even inter-religion understanding? This is where the narrative gets a bit thin. Gaetan notes two contributing factors to the contemporary rise in ecumenism: the need to respond to the marginalization of religion in a secular age and a shared sense of vulnerability in response to an escalation of violence.

It's at this point I scratch my head. Are those the only two things that could be mentioned? Both are external causes acting on the Christian community, forcing it to react to a new situation. Like the pressures in the theory of natural selection, they are environmental factors determining the development of the social organism. That means internal motivations, like a common confession or the biblical testimony about the desirability of unity, aren't treated on par with the external causes, which would surprise Benedict and Francis, and their counterparts in the Lutheran, Anglican, and Orthodox communities.

Gaetan's explanation also seems to me too narrow. Once the importance of internal factors is discounted, or at least demoted in order of importance, those broad inter-generational shifts that sweep everyone up, making children collectively doubt the wisdom of their parents, also get missed. Of course, Gaetan is aware that the age of polarizing ideologies has been over since the Berlin Wall fell in 1989. The story of John Paul II's papacy cannot be told without reference to that epochal event.

But something I have noticed while reading works published through the 20th century, and while growing up in the last two decades, is that we no longer take our ideas as seriously as our parents and grandparents once did. We are no longer idealists in the high modern sense of the word. And this has a number of obvious consequences. We no longer think of communities as wholes to which we belong. We don't love abstractions like humanity or a nation like we used to. The boundaries between us and them are being recast on different social fault lines. Where those will lie is not yet clear. Certainly socio-economic divisions, for example, in North America, will be more pronounced than they were in the middle of the 20th century.

That sort of shift in attitude cannot fail to have an effect on the broader Christian community. In my own life, I witnessed the bottom fall out of a belligerent indifference to other Christian denominations in the Christian Reformed Church. The more Evangelical among the members found common cause with a wider Evangelical community, while the more intellectually inclined became more sympathetic to the Roman Catholic and Orthodox Churches. Some regretted the loss of theological distinctiveness; but that only confirms my thesis that we no longer take our heady ideas so seriously.

I am not claiming we have stopped thinking, only that we think differently about ourselves. It seems to me the papacy's push for ecclesiastical reunification makes sense in the context of our mounting disillusionment with old intellectual idols. The election of a pope like Francis, a pragmatic servant of the people, is entirely of a piece with the intellectual climate.

Sunday, June 23, 2013

My Discovery of the X-Files

I missed the X-Files in its hey-day. The nine seasons running from 1993 to 2002 corresponded almost exactly with my teenage years. But I was too busy watching Star Trek: TNG, DS9, and Voyager, too busy reading the classics of science fiction and fantasy, or quickly paging through the latest literary addition to the Star Wars universe. Though the literary quality of the latter, it needs to be said, went quickly downhill after Timothy Zahn's Thrawn trilogy.

For the longest time, the X-Files lay just over my cultural horizon. Until this summer, actually. Netflix offered a free month's subscription a week before its installment of a fourth season of the Arrested Development franchise went online. Watching the new episodes took an effort of three or four days, which left the greater part of a month on the subscription. The X-Files was on my recommended list. Netflix had followed the path I wandered through its offerings of movies and television shows. By the infinite wisdom of a selection algorithm, I discovered the truth really is out there.

I am now almost through three seasons. The show exercises a strange sort of persuasive power over its viewers. Which speaks to its quality, since it can no longer fly on its innovative cinematographic techniques alone.

The backstory has the US government continually suppressing evidence of extra-terrestrial life. Each episode uses the pretext of an FBI investigation to chase some conspiracy theory down a rabbit hole. The name of the show is taken from the name of a supposed FBI office of investigation. The names of its only two assigned agents, Fox Mulder and Dana Scully, have found their way into a grab-bag of references that help us navigate webs of cultural significance.

I admit I did not quite get the show until the scriptwriters used the latter part of the 2nd and 3rd seasons to develop Dana Scully's Catholic background. Until that point Mulder's willingness to entertain the strangest of explanations played off against Scully's rabid faith in empirical explanations. Between the two characters, the limitations of the methods of scientific study were poked and prodded. The point, I take it, was to show how credulity is an attitude a person takes to the evidence, not something produced by the evidence.

With Scully's Catholic background, the possibility for comparison opens up considerably. We discover Scully's willingness to believe the sorts of things faith requires--but to believe on faith, which means an open, questioning attitude towards the things required by faith. Mulder, by contrast, believes the sorts of things Ockham's Razor allows him to believe. In the absence of definitive evidence, the simplest explanation may not be the expected terrestrial answer.

Written into the contrast between the two is a fairly profound disagreement about the nature of human intelligence. What standard is it measured against? Measure human intelligence against a divine standard, and Scully's anthropocentric convictions follow as a matter of course. But if the divine standard is absent, Mulder's speculative suggestions become much more plausible. With a God above, the human being finds meaning within; without God, we want to look further afield.

The genius of the X-Files is to leave the viewers to decide whether what they saw was real. (Yes, I know, aliens all but parade across the television screen.) Even when the existence of actual alien life is all but confirmed with an appearance on screen, the viewer can still take credible refuge in Scully's skepticism. Aliens might instead be the unfortunate victims of genetic experimentation. The suggestions are not always made explicit. They do not need to be. Scully's incredulity makes it possible to question what you believed you saw.

Most of the way through the third season, I am not eager to discover how the show gradually declines into mediocrity after the fifth season. 

Saturday, May 04, 2013

The George W. Bush Presidential Library and Museum

Last week saw the dedication of the George W. Bush Presidential Library and Museum, situated on the campus of Southern Methodist University. The project is the result of one half billion dollars in fundraising. Its dedication was attended by every living president, from Jimmy Carter, through a wheelchair-bound George H.W. Bush, to a spry, and comparatively young, Barack Obama. For a brief moment, it occupied national and international attention, with most major American news sources adding their particular take to a very well worn story. Here is a sampling of journalistic fare from the Washington Post, Foreign Policy, Mother Jones, the New York Times, and The Atlantic.

Reporters seem to have gravitated towards telling one of two stories. The first looked at the "Obama angle". Yes, the current POTUS was in attendance, and he was also able to set aside ideological bickering across parties for a very brief moment. This story is favourable towards Obama, but rides on the idea that the Office of the President stands above the Washington fray. Bush escaped largely unscathed. The second story looked more closely at the content of the library and museum, which will house around 43,000 artifacts and millions of documents from the 43rd president's tenure. Reporters, champing at the bit of the demand for journalistic objectivity, raised obvious questions about whether the history told by the library and museum will in any way reflect reality. The Mother Jones article linked to above lists eight things you won't find in the new library and museum facilities--the eighth being evidence of the existence of WMDs in Saddam's Iraq. (Because there weren't any.) In this second story, Bush assumes the form of an object of scorn.

Of the two stories, I find the second intrinsically more interesting. With the first, we get to watch the great game being played out in a highly controlled environment. Instead of being allowed to criticize their opponents explicitly, politicians have to score points by appearing to play nice. Obama and Bush are cast as figureheads for much larger trends in American society. Their ability to play nice for a brief moment is indicative of the fading memory of a common destiny for all Americans. Compared to the immediacy of the first story, however, the intrinsic charm of the second is found in its impoverished rendition of Winston Churchill's audacious claim, 'History will be kind to me for I intend to write it.' Churchill wrote rather glowing accounts of the role he himself played in WWII in a six-part book series on the conflict. Consider the many hours he invested in literary production alongside the achievements of Bush, who has taken up painting. Bush solicited wealthy friends to put up millions of dollars to have someone else do what can only be described as whitewashing a rather tarnished public image. Intellectually lazy only begins to describe this latter-day attempt to resuscitate a legacy.

My own take on the efforts of Bush and friends will be obvious from the tone of the preceding. The outrages of the GOP alumni against historical scholarship, however, interest me more as an illustration of larger problems associated with the interpretation of human history than as examples of individual failings. The default assumption, or what can best be described as the common-sense way of thinking about things, is that the study of human history gets at something objectively out there waiting to be discovered, in the same way that the fundamental features of matter or new species of animals are out there waiting to be discovered. Hence Bush is confident that the 'facts', maturing with the passage of time, will reflect well on his tenure as president. Once all the facts are known, or have come to light, or what have you, they will show him in a much better light than his detractors are presently willing to admit. Not surprisingly, those detractors are convinced the same set of facts, in due course, will prove otherwise.

Which raises the question, What is meant by the term 'fact'? The dedication of a temple to Dubya's prowess raises questions about whether and in what sense the study of human history, or the study of the humanities more generally, is comparable to the natural sciences, like physics, chemistry, or biology. Is the historian's object, say, Bush's tenure as president, objectively available in the same sense that the natural scientist's object is available? This is not something scholars and scientists spend much time fretting about. The construction of the modern university discourages comparison between academic disciplines (even as it encourages something called 'interdisciplinarity'). Both groups can go about their day without giving much thought to where they stand vis-a-vis the other.

The fundamental criteria for factuality are that the object in question be observable and that others be able to verify what was observed, in order to confirm the success of a theoretical framework that accounts for what was observed. The theory of evolution is one such framework, within which are organized the relationships between different species of animals--or 'the facts' derived from the observation of fossils and living organisms. On this definition of factuality, Bush's tenure as president fails one of the fundamental criteria. While there is initial observation of the object, there is no possibility of verification. Bush's tenure is a one-time, unrepeatable affair--thankfully.

But so is the evolution of this or that species of organism, would seem to be the obvious objection. That's true; but there's a second consideration that further complicates the comparison. The material evidence for evolution and the material evidence for Bush's tenure as president are fundamentally different. Scientists theorize about processes operating in biological materials independently of creative human input. Human beings didn't guide the long process that led to the evolution of human beings. The material evidence for the evolutionary process is there to be studied, theorized about, maybe even interfered with, tweaked, 'improved' upon--but that's all. The historian, on the other hand, studies a body of material evidence that could have no existence apart from creative human input. It is impossible to conceive of all those textual, audio, and visual artifacts attesting to Bush's tenure as president arising through non-human agencies, which is what evolutionary processes are. The historian never escapes the circle of humanity.

So I raise a Shakespearean equivalent to the middle finger and call a pox down on all their houses. How Bush's tenure will be judged in the long term won't come down to something called 'the facts', however broadly or narrowly that might be interpreted. Bush and friends need to take a page out of the Confucian playbook about the inevitability that future generations will judge you. Future generations will judge your actions, not against a set of objective facts, but for your humanity. If, to cast it in terms of extremes, you were a tyrant or a dandy, don't expect to be looked upon favourably. If the suffering of the mass of humanity increased under your watch, don't expect to be lauded. It is one of the features of the interpretation of human history that the next generation is not likely to agree with your own assessment of yourself, especially if it's trumped up way out of proportion. And if the next generation doesn't, then the generation after that will--or the generation after that, and so on, and on, and on.

The lesson is that one does better measuring oneself against one's fellow human beings than against some objective standard or abstract goal. We are all, every one of us, in this together.

Monday, April 22, 2013

Teaching Philosophy

The study of philosophy, or the study of what other people have said that gets categorized under the heading 'philosophy', has the potential to leave one feeling like a perpetual student. Expertise can be acquired in a so-called "field" of study. Along the way, however, students will have realized their "field" of study looks less like an actual field and more like a narrowly defined section of a bookshelf in a stuffy library. They will also have realized that one never finishes studying philosophy.

It's an interesting thought experiment to put the shoe on the other foot. What if you had to teach philosophy? Where would you start? Any student can give a non-committal answer to questions about what they are studying. They are on their way to knowledge, and, in any case, there is too much to capture in a single phrase. On the other hand, a teacher lacks that bohemian luxury. They must say something.

The first thing I would do is set aside any latent Socratic inclinations. The Ancient Greek philosopher Plato cast his teacher, the famous Socrates, in the role of questioner, leading his conversation partners to the realization of truths they already knew, but had not quite been able to articulate on their own. My approach would instead include a discussion of a famous philosophical text. We would never have known Socrates, after all, had Plato not cast him as a character in his philosophical dialogues. Regardless of how much talking philosophers do, they are always falling back on texts, each of which preserves a small portion of a tradition of reflection that never quite comes clearly into focus.

More to the point, then, which text would you start with? And let's say, for the sake of argument, it could only be one text. Not one thinker. Not one philosophical school. Not one series of texts. One text.

My choice would be Rene Descartes' Discourse on Method (1637). There are a few reasons why.

First, the text is relatively brief. It contains six chapters in total, all of which can be digested in a single sitting, or over a number of sittings without much difficulty.

Second, the purpose of the text is not to take a position on anything in particular, but to introduce the reader into a way of thinking about things. You are invited to follow Descartes on his philosophical journey, to think through why he came to the conclusions that he eventually did.

Third, Descartes' philosophical observations are woven into a personal narrative that allows for historical commentary. That means a teacher can put "flesh" on the bones of the argument, placing arid speculative suggestions in a more recognizable human context.

Fourth, the Discourse contains recognizably contemporary intuitions about the nature of things. It takes the form of a personal narrative. It is playfully experimental with the ideas it presents. It wonders about the nature of the human self. And, most importantly, God has been displaced from the center of inquiry. The entire world of human experience is no longer assumed to come from God and return to him. (It still does, of course. Descartes holds God to be the Creator of all things. He is not willing, however, to start with that as the presupposition of his inquiry.)

The basic argument of the Discourse is that all those things Descartes had formerly held to be true he had discovered many reasons to doubt. His experience of violent and destructive discord among different Christian sects during the Thirty Years' War had led him to seek a more certain basis for knowledge. The revelation of God could not be trusted, as it was mediated by human beings. The same went for the teaching of the schools, by which was meant the abstruse logic-chopping arguments of the late medieval world.

So Descartes resolves to doubt all that can possibly be doubted, and in the process doubts not only what other people have told him, but also what his own eyes and ears "tell" him. He even suggests, for the sake of experiment, that it is possible to conceive of oneself existing without a body. After "emptying" his mind of all those doubtful thoughts, Descartes arrives at the one conviction he cannot shake: I think therefore I am--"that is to say, the mind by which I am what I am, is wholly distinct from the body, and is even more easily known than the latter, and is such, that although the latter were not, it would still continue to be all that it is."

All of this is ripe for discussion. Is the format of philosophical dialogue effective in conveying the author's intention? To what extent is one's thinking conditioned by one's lived circumstances? What would it mean to begin one's thinking with God rather than oneself? And vice versa? Is it possible to doubt everything? Can we ever be certain of anything? What might it mean that our minds are completely distinct from our bodies? Does any of this even make sense?

The closer you look at a text like the Discourse, the more perplexing it becomes. Descartes is perhaps best described as not quite modern enough. Which makes his Discourse the perfect choice with which to begin teaching philosophy.

Does anyone else have other suggestions?

Friday, April 19, 2013

Menial Labour

I am no stranger to what is elsewhere called 'menial' labour. Growing up in rural Ontario, my first jobs were both physical and monotonous. The same tasks had to be performed day in and day out. The jobs were, almost without fail, dirty jobs--especially when I was cleaning things. I was good at these menial jobs. I wasn't great at them. I could perform adequately the tasks required of me, though I was unlikely to perform them expertly or to take much of my own initiative. The sorts of thinking required to see solutions to very rural and/or blue collar problems were not in my possession.

I also have exposure to white collar 'menial' labour. The most recent bit of experience I can cite comes from last night invigilating a chemistry examination. I thought I would try invigilation out this year, so I threw my name into a pool of potential hires. A single hour-and-a-half training session a week in advance and a 15-minute pep talk before the exam were supposed to make our tasks straightforward and obvious. Then a 25+ person team was sent in a number of directions, with an examination 'package' in hand, but more or less without support.

Sent to the room with the examination package, complete with examination papers and instructions for their distribution, as well as a half hour to spare, I realized immediately that someone getting paid a lot more than me had failed to assign the necessary second person to the room. Unable to raise my supervisor on the phone, I started prioritizing tasks. The examination 'circulator' eventually made their way to the room that I was in and realized much the same thing. For some reason, though, it was my fault that things weren't getting done the way they were supposed to get done.

A second person was sent to the room a half hour after the examination had started, which had itself been delayed by ten minutes. Having been told repeatedly to follow every step on the invigilation instruction sheet, I relished the opportunity to cut corners where corners could be cut. It wasn't my fault, you see. I did the best with what I was given. If my best wasn't good enough, don't blame me for doing my best. Blame my superiors for their incompetence.

This most recent experience with white collar 'menial' labour impressed upon me the dreadful impenetrability of bureaucratic structures, in particular that of those in immediate authority above you. The experience also raised some questions, in my mind, about why the exercise of authority proceeds so differently in a rural and blue collar world than in a white collar world (though my observations would also apply to a highly structured factory environment).

As I said above, I was a good worker, but not a great worker. Those persons under whom I worked, whether in farming, landscaping, moving, or construction, seemed to understand as much. I put in long days of work, and only once or twice over the course of a decade do I remember being belittled for a failure or mistake. More to the point, those persons with whom the responsibility ultimately lay usually went about fixing the mess that I had made without too much complaint. There is a certain inevitability to mistakes, was the guiding sentiment. Try to prevent them, but deal with them as humanely as possible when they do happen.

In the white collar world, by contrast, I was surprised how vigorously my supervisors made it plain to me that their failings were ultimately my responsibility. There is a certain rationale for doing so, of course. In the moment, I am the one who has to perform in order for their program to be put into action. But the bureaucratic structure falls to pieces when those in charge fail to anticipate an obvious problem and also vigorously protest the smallest exercise of independent judgment in the matter. The bosses not only think you are stupid and incompetent. They treat you like it too.

Why the difference between these two sorts of bosses? It may be that what I am describing is merely a function of the size of the organization. But I have to think it is also a consequence of the sorts of materials being worked on. In the rural and blue collar trades, you work with particularly stubborn, resistant, and in every case non-rational materials. Fields of wheat do not rebel against you, nor do skids of lumber and brick talk back at you. Persons assigned to do a specific task in highly structured, rationalized processes, on the other hand, are expected to comprehend and implement a set of instructions in very short order. They are also instructed not to think for themselves, which, if something should go wrong, has a real potential to allow things to go from bad to worse in very short order.

So I wonder if facing stubborn, non-rational resistance necessarily inculcates a very different sort of response in bosses than does facing the apparent irrationality of menial wage labourers in a highly structured working environment. Why do we expect different things from persons than we do from the non-human sorts of materials that we work on? Arguably, human materials are more difficult to shape to our wishes.

Tuesday, April 16, 2013

What is Philosophy?

I have a few moments. So I want to ask the basic question: What is philosophy? Instead of answering the question, though, it might be useful to reflect on how the question might be answered.

It seems to me we have two basic options, whether due to the limitations of language, or cognitive capabilities, or both. We can say philosophy is what it is (that is, philosophy, which doesn't get us very far at all), or we can define it in terms of something else (e.g. the love of wisdom, critical thinking about X, Y, and Z, etc.). These two possibilities represent relations of identity and relations of difference. They are probably best termed strategies for analysis, not necessarily methods for getting at the truth of things. Relations of identity presuppose differences between identities, and relations of difference presuppose identities which are different. When I ask, What is philosophy? simply by using the word, I bring along a host of more or less (as yet) uninterrogated meanings.

If wisdom is identified with the divine and philosophy is the love of wisdom, for example, then philosophy is also the love of God--which raises questions about whether philosophy and theology and/or religion are so different. If critical thinking is identified with an inquirer ready to question every possible assumption, then philosophy is allied with critique, doubt, or a skeptical stance towards knowledge claims--which raises questions about whether philosophy has anything in common with the dogmas of religion and theology. I don't want to ally myself with either of these definitions. I do want to observe the interrelatedness of definitions with other definitions.

Now, I have a Masters in Philosophy. However, I was warned, in a roundabout way, by my supervisor not to pursue a Ph.D. in the discipline. The reason given was that my thinking was much too theological in cast to succeed in a philosophy program. That's probably more or less true, though I ended up in a Religious Studies faculty, not a Theology department, which was my preference. The result is that I am now quite happy teaching and thinking about subjects related to religion and its history.

You see, it seemed to me that, for the same reason I wasn't prepared to do a Ph.D. in Philosophy, I also wasn't prepared to do a Ph.D. in Theology. Everyone was talking (that is, from my naive undergrad and Masters degree perspectives) about philosophy and theology as if they were objectively describable things to be studied. With regard to theology, that made a certain amount of sense, since theologians claim to be talking about something real, something 'out there', which has been mediated by scriptural sources and a long textual tradition of reflection on those scriptural sources. In the case of theology, there is something out there to objectify, something I can point you towards, something we can consider together and talk about.

What about philosophy? There appears to be a textual tradition going back to Plato and Aristotle that can be studied. Though I suspect philosophers prize at least the idea of freedom of inquiry too much to be explicitly tied down to any specific set of texts. One hears it suggested that philosophy is not limited to the study of a certain body of literature, but is a way of thinking about things imparted from teachers to students (much as Socrates was supposed to have imparted his wisdom). That may be the case. Such a definition only distracts from the omnipresent role the study of texts plays in philosophy departments or philosophical armchairs (whereupon the armchair philosopher sits).

At this point, in order to wrap up a blog post that is already much longer than I anticipated, I want to show my cards. I have soured towards the idea that separate academic disciplines (philosophy, theology, history, political theory, English literature, etc.) are as distinct from each other as many of our teachers have supposed. It seems to me that common to each of the so-called separate disciplines is the thinking human being, reflecting on some body of evidence. There is no thought without some object, as David Hume reminded his Cartesian interlocutors--at least none that I am aware of. The theologian thinks, the historian thinks, the philosopher thinks, etc. They think differently, however, according to their different objects of inquiry.

And it seems to me, if philosophy is anything, it is reflecting on (or thinking about) how we think about things. Full stop. The definition of philosophy needs to be made with reference to the human being who thinks about things, and not some set of abstract definitions. Not, say, the love of wisdom apart from the person who loves wisdom. Not critical inquiry apart from the person who inquires critically. Not a definition considered at an abstracted remove from the person considering the definition. Rather a person who can say to themselves, I am thinking about things, and that's what I normally do; and when I philosophize, I think about what it is to think about things.

Saturday, April 06, 2013

A/theism and Certainty

Patrol Magazine retweeted an article published last October on the modern history of A/theism. The two words theism and atheism are paired together, the article's argument goes, because modern theism cannot be understood apart from modern atheism, and vice versa. They 'emerged from the early modern world together, as two sides of the same coin', a claim which fits well with the portrait of modern culture painted by the intellectual authorities, including John Milbank (Theology and Social Theory) and Charles Taylor (A Secular Age). The contest between theism and atheism in the modern age is presented by partisans as a zero-sum game. The winner must take all and the loser must be vanquished from the field.

The author of the article, Kenneth Sheppard, notes a correspondence between late 20th century assessments of modern A/theism and 16th and 17th century attempts to cover the same intellectual ground. For so many of the persons involved in the discussion, theism and atheism go together like transcendent and immanent, each term in these pairings excluding the other, but also presupposing the existence of the other in its own need to exclude something. Atheism needs theism like science needs the straw man of religion to knock down. Theism needs atheism like good needs something evil to vilify. In this sense, they are like children behaving badly.

Sheppard situates A/theism in larger 'processes of disenchantment, desacralization, and secularization'. Instead of seeing theism and atheism as opposed over matters of religion and science, the better thing to do is observe how theists and atheists make sense of the world as the language of scientific discovery drives fantastical claims from the public square. Instead of demonizing one from the vantage of its opposite, pause and take note of those cultural trends they commonly presuppose. The two sides may talk as if they share nothing in common. Historians like Sheppard, however, know better than to buy into their self-assertive, but partial, ideological perspectives. Where there exists contiguity in space and contemporaneity in time, ideologues are shown to be liars of the first order. All the talk in the world cannot hide the fact that some cultural currency is shared in common.

The analytic framework proposed is a helpful move in the right direction. Once one stops trying to measure the perspective of one's opponents against the measuring stick of History (with a capital 'H'), it should become a whole lot easier to have a conversation--in principle, at least. When the political left, inclined towards atheism, and the political right, inclined towards some variety of theism, are divided from each other as past is from future, there is very little reason to talk. Conservatives are stuck in the past, say the progressives, and progressives have forgotten the past, say the conservatives. The measuring stick of History tends to distract from an obvious truth: that all of our business with each other is transacted in that shared moment, when the past is no more and the future not yet, called the present.

The purpose of the article, if I have understood correctly, was to do what is termed, in very post-modern language, creating space for dialogue, where 'traditional religious believers, “nones”, and atheists can relate to and work with one another in spite of what can seem like our insurmountable differences.' This is all well and good, and I am all for having a friendly conversation on a level playing field. But the article's argument seems to thrust readers in the direction of abandoning their idols, all those things they hold dear, without actually interrogating why we hold onto our idols with the tenacity that we do.

Sheppard speaks very generally about historical processes, and very little about historical actors. That is a problem, it seems to me, because I have never encountered one operating apart from the other. He speaks very generally about what we believe about the nature of God and the world, and very little about what we have thought about ourselves.

If there is one thing that sets modern A/theism apart from its premodern manifestations, in my estimation, it's an ideal of certitude shared by all alike. The modern atheist rests assured that there is no God because the evidence is lacking, while the modern theist does the same because that's what the Scriptures say. The sorts of evidence to which appeal is made change, but the constancy of conviction does not. The origins of the certitude might be traced to such luminaries as Martin Luther ('Unless I am convinced by the testimony of the Scriptures or by clear reason...Here I stand. I can do no other.') and Rene Descartes ('This proposition, I think, therefore I am, is the first and the most certain which presents itself to whoever conducts his thoughts in order.'). The exposure of the baseness of all these simplistic appeals to certitude, e.g. in the work of Nietzsche and his post-structuralist disciples, might also be cited, though as proof of just how deep our certainty runs, now that we have become certain of our uncertainty.

So I will take my departure from Sheppard where he suggests we tell 'critical stories' about the 'conditions of our belief'. (Why not build a campfire and bring some guitars?) That suggestion sounds like an exercise in talking around the issue, which has instead to do with whether and in what sense we are certain.

Thursday, April 04, 2013

Muller on Thought and Language

The Ancient Greeks used the word 'logos' to symbolize two sorts of things we today usually keep separate: on the one hand thoughts, and on the other hand spoken words. Not even written words (like these words on the screen in front of you) were regarded as highly as spoken words. Only the spoken word carried the immediate force of a person's thoughts. Spoken words carried the force of a person's soul, their purpose, even their life. Words on the page were dead letters, hollow reminders of things once spoken.

Reading through Friedrich Max Muller's lectures on Natural Religion, I was impressed by just how far our intellectual convictions in the 21st century have wandered from Ancient preoccupations. More recent figures like Thomas Hobbes and John Locke could still have carried on an agreeable conversation with the Ancients about the things that follow as a consequence of the intimate relationship between spoken words and thoughts. Intuitively, I think, we should also be able to recognize what they are talking about. We each have our own 'internal monologue' by which we think through ideas in the form of a more or less broken conversation with ourselves. (Please tell me I am not the only one!) But we don't place the same sort of theoretical value on the distinction between our internal monologue with ourselves and an external dialogue with other people (or with yourself, though that usually attracts the concerned attention of other people).

Muller establishes, fairly persuasively in my estimation, that no human being thinks without words. Our knowledge of language comes out of processes of socialization, especially early on in life. Knowledge of language allows for the communication of desire, purpose, or query. All of those 'higher cognitive functions' seem to depend on a mastery of language. Now that is not to say that other animals do not cognize and communicate. But what they lack, Muller thinks, is the ability to abstract and categorize, analyze and synthesize--specifically those things that have allowed human beings to cultivate the ground, transform the natural world, build up a civilization, and write books and blogs about it, wondering what it is to be a being that has words--logoi.

The conclusion he eventually puts to his readers still manages to be something of an eye-opener.
The reason why real thought is impossible without language is very simple. What we call language is not, as is commonly supposed, thought plus sound, but what we call thought is really language minus sound. That is to say, when we are once in possession of language, we may hum our words, or remember them in perfect silence, as we remember a piece of music without a single vibration of our vocal chords...But as little as we can reckon without actual or disguised numerals, can we reason without actual or disguised words.
The first part of Muller's observation is strange enough on its own. It never dawned on me to ask myself whether language was thought plus sound or thought was language minus sound. The comparison itself is intelligible enough. I have thoughts, and you can't hear them unless I speak my thoughts, at which point my thoughts become audible words. But I never thought the difference might be theoretically productive.

Muller's decision against defining language as thought plus sound in favour of thought as language minus sound is even more perplexing. (Hence I am blogging about it.) The decision corresponds well with the above noted observation that our knowledge of language--and our ability to think--comes through processes of socialization. We don't just make up our own words. Someone, usually parents, teaches us how to use them. The decision also conceptualizes words as objects of study. They are cast as things that we can both look at and think about, and then have a conversation about. They are perceptible objects, ultimately not reducible to the interpretive whims of persons.

But does Muller's account make sense of our individual experience using words? When he says thought is language minus sound, he seems to suggest that the language we use does our thinking for us. And, no doubt, there is something to this. If people spend enough time together, talking to each other, they end up thinking more or less on the same lines. We tend to listen to and read things that confirm our sense of the world around us.

I have to wonder, though. I personally have had the not infrequent experience of lacking the right words to express my intention. The words don't correspond quite right to an objective state of affairs, and so I find myself unable to communicate my meaning. The result is that the logoi in my head don't always seem to match up with the logoi in someone else's head. The only thing to be done is root around in my head for better words or better ways of stringing words together.

The Ancient Greek idea of logos is able to make sense of this situation. It locates intelligence both inside and outside a person's head, but doesn't require that the correspondence between them be completely transparent. About Muller's conception of language, I am not so sure. If thought is really just language minus sound, if the correspondence really is transparent, one has to wonder who is doing the thinking.

Wednesday, April 03, 2013

Dying with Iain (M.) Banks

The Scottish science fiction writer Iain Banks announced to the world today that he probably only has a few months left to live. Diagnosed with gall bladder cancer a couple of weeks ago, Banks has put his feverish rate of literary output on hold indefinitely, asked his partner of many years if she would do him 'the honour of becoming my widow', and plans to spend the remainder of his days visiting with family, friends, and locations that hold personal meaning. He has not yet decided whether he will pursue chemotherapy treatment to extend briefly what time remains to him.

Banks breathed new life into the high art of hard science fiction, which had known such masters as Isaac Asimov and Arthur C. Clarke, with his series of Culture novels. The better exemplars of the genre are defined by a certain cosmic gimmick, setting the stage on which the plot line unfolds. In Asimov's Foundation Trilogy, for example, the discipline of psychohistory, developed by the patriarchal character Hari Seldon, promised to unlock the key to social development. Seldon predicted the decline of the Galactic Empire, and laid foundations for a much more durable successor. The predictive failure of psychohistory to account for an enigmatic figure known as the Mule, a sort of galactic Napoleon, drives the plots of the second and third parts of the trilogy.

The cosmic gimmick driving Banks' Culture novels does not allow for quite so much human participation. The Culture novels form a collection of more or less disconnected narratives set in the same universe. The Culture is a vast civilization governed by massive artificial intelligences, who keep a human population sprawling across planets, airspheres, orbital platforms, shellworlds, and ships spread across a large portion of several galaxies (if my memory serves me correctly). The narratives play out in the vast distance between the finite human mind and what are, for all intents and purposes, practically infinite Minds. Banks has a gift for imagining vast intelligences whose experience of space and time is utterly dissimilar from human perception.

The Culture is a 'post-scarcity' society, in which no citizen lacks for their basic needs. Surrendering the government of human society to the Minds, removing human avarice, error, and whim from the political equation, meant that material equilibrium in society was now possible. Money and personal possessions no longer exist, though material prosperity still allows for the cultivation of privacy. There is a moral seriousness to Banks' storytelling. He doesn't shy away from exploring the fiber of a society that has grown fat, complacent, and playfully irresponsible, and whose personal bonds are reinforced by an artificial structure. At the same time, the Culture narratives seem to play out like an internal monologue in Banks' own head as he explores the logic of his atheist convictions. Many of his characters regard their own existence with the sort of bemused shrug one can well imagine their author shares. A touch of the great stoic Scotsman David Hume exists in Banks--and there would be more, if he weren't so damned Hegelian.

I started reading Banks' work about six years ago, around the same time I picked up George R.R. Martin's Game of Thrones series. It was his ability to expound on philosophical themes in novel form that prompted me to read as much of his work as I had the time to spare. Like so many other science fiction authors, Banks rethinks divine transcendence in terms of a future state of affairs, rather than an eternal present, which is the same everywhere: past, present, and future. Divinity, though still exceedingly powerful, is placed under spatio-temporal constraints. In the case of Banks' Minds, they emerge from the depths of human creativity, achieve independent sentience, and are let loose to care for their creators. Granted, this only seems like a different form of servitude; but the Minds, particularly the ship-based Minds, seem to take it all in stride and with dry humour.

Science fiction writers are usually at their best mocking the old ideas of God and domesticating them to their purposes. I say usually because I am not sure that someone like Robert J. Sawyer actually knows how to do anything more than preach to an atheist choir. Banks' literary engagements succeed, to my mind, on account of his willingness to acknowledge that dethroning the old gods does not eliminate the existential questions for which the old gods provided answers.

Not wanting to sound insensitive, I will be curious to watch the moment when the pen Banks uses to write this final chapter of his life finally falls from his hand and is taken up by an increasingly vocal atheist elite. Banks' life is likely to be eulogized, his self-sufficient hold on existence, his lust and zest for life, held up as an example for atheists everywhere, much like the late Christopher Hitchens' life has been celebrated.

Hagiography is a double-edged sword. When you extol the virtues of mere mortals, they usually end up appearing more mortal and less virtuous. In any case, it has very little to do with the person being eulogized, and more to do with what he or she has meant or continues to mean for those of us who live on. But perhaps it is best not to speed Banks along his way just yet by thinking on what might be. Some time still remains. And the publication date for one final book has been moved up.