Thursday, February 28, 2013

Thinking about This, Here and Now, with G.W.F. Hegel

As I wrote elsewhere, Idealism is the name given to the counter-intuitive thought that every thing is a thought. Usually we go about our days thinking that most of the things we think about are not actually in our heads. Find something ready-to-hand to think about: a desk, a USB key, a word on a page or screen, anything. Is it in your head? Though the thought of the thing may be in your head, your average, everyday self says, the thing itself is not. Which is precisely where the idealist hopes to trip you up. The distinction between the thought about the thing and the thing itself is one that you draw. In what sense is the distinction that you draw not in your head? You drew it. And if you drew the distinction, does that not mean by extension the thing itself is also in your head? At the very least, it would seem to imply there is nothing outside of your head, that your head is all there is.

Most of the time, Idealism looks like a fun little game intellectuals play with themselves. All these clever questions, circular arguments, and impossible problems appear nothing more than a frivolous form of auto-intellectual-eroticism. Mere verbal swordplay. Until the game becomes deadly serious, that is, and other people start drawing out the consequences some intellectual has spun from his game for the form of community, for the nature of political authority, for answers to questions about good and evil. When that happens, Idealism suddenly looks serious, and for someone like myself, it becomes worth thinking through.

There are very few places in G.W.F. Hegel's influential Phenomenology of Spirit where the text speaks directly to a recognizably human situation. Most of it is shrouded in impenetrably abstract, self-referential prose: secrets wrapped in enigmas hidden in mysteries, which is standard idealistic Hegelian fare.

Hegel's discussion of the 'master-slave dialectic' is one of those few places where a recognizably human situation comes to the fore. The master steals the labour of his slave, thus 'alienating' the slave from the work of their own hands; the master possesses authority without contributing to its upkeep, while the slave works to maintain authority without possessing the freedom to exercise it. If left on its own long enough, this situation will create such tensions that the slave is likely to rebel against the master. These are simple observations, governed by an abiding sense of what constitutes equitable relations between persons, bolstered by a conceptually sophisticated account of the nature of existence as such.

Hegel's analysis of what he calls 'the dialectic of sense-certainty' yields another such recognizably human situation. The discussion also occupies an important place in the first full chapter of the Phenomenology, following a lengthy Preface and a shorter Introduction. The analysis prepares the ground for everything else that follows, including the discussion of the master-slave dialectic, true individuality, culture, morality, religion, etc.

Hegel makes the nature of the dialectic of sense-certainty very easy for readers to grasp. All he asks you to do is to think of something. It helps if you think about something that you can see and touch, to anchor this exercise in something solid and tangible. The act of thinking about something will always have two components: an 'I' which thinks, and a 'this' which is thought about. I am thinking, for example, about this computer monitor.

Now, Hegel observes, knowledge derived from our bodily senses seems the most certain sort of knowledge that we possess. Hence: sense-certainty. The computer monitor in front of me does seem to be what I think of as a computer monitor. But is this true?

When I fix my attention on a 'this', the 'this' is situated both 'here' and 'now'; that is, my attention is fixed on an object occupying a particular spatio-temporal location. Try it, if you don't believe me. When you think about the computer monitor, you will indeed find that it occupies a certain location in space at a specific time. It will also do so when you don't think about it, but I am trying not to confuse things more than necessary by adding layers of complexity.

A number of things can happen to this computer monitor, here and now. Your attention can wander, in which case it's no longer the monitor occupying your attention, or not-this, but that. The location of the monitor might change, in which case it's not-here, but there. The time will also continue to move onwards, which means it's not-now, but then.

Your attention, or I, remains fixed on the this, here and now, but the content of the this, here and now, changes. The this is no longer the same this you were thinking about. In more Hegelian nomenclature, you asserted a this, which turned out to be not-this. The continual conscious movement from a this to not-this is the dialectic of sense certainty. The dialectic is problematic, in the same way the master-slave dialectic was troublesome. There is no obvious order to the movement. Nothing is certain. The dialectic needs resolution.

Still, we are prone to ask, Who cares? My attention shifted; things moved; at least time moved onward. These are banal observations. Well yes, but you probably aren't thinking like an idealist. The movement from a this to a that, or a this to a not-this, is the first negation. Something concrete, substantial, something particular, like a computer monitor, has been negated. The story is not over, though. Your attention will wander, time will move onward, and the first negation will itself be negated--a negation of the negation.

The negation of the negation is not so easy to understand as the dialectic of sense-certainty. In a manner of speaking, Hegel is asking you to suspend your disbelief for a second and follow him. The thing of which you were originally conscious, e.g. the computer monitor, may not have been the thing you were conscious of a moment later, e.g. your phone. Fine. You could keep on negating things with other things as your conscious awareness wandered. After negating the computer with the phone, however, Hegel asks you to abstract yourself from thinking about all particular objects and negate the phone with the this. Because, in fact, the this was always present, either as this computer monitor or this phone, and finally, with the negation of the negation, simply as an abstract universal this.

The basic structure of conscious thought, Hegel says, includes an I which thinks about this, here and now. These are universals: an I containing all Is, a this containing all particular thises (like computer monitors and phones), and a here and now containing all particular heres and nows. You are potentially, though not actually, omniscient, omnipotent, and omnipresent.

If you can make the conceptual leap from thinking about particular things to thinking about abstract universals, and if you keep in mind that you are making the conceptual leap, that all particulars and universals are contained within a single movement of your own thought, then you too can think the thoughts of God. Or can you?

Good question. Saying that the conceptual leap can be made is sort of like saying it's possible to resolve the master-slave dialectic in reality.

Wednesday, February 27, 2013

Defining Religion, From the Inside Out

Every time I begin another section of RELG 207: Introduction to World Religions, I come back to the problem of how to communicate what exactly is at stake for the adherents of various religious traditions. It's very easy to teach Buddhism and Hinduism as the curious ideas in which other people believe, which you need to know to make sense of an increasingly globalized world. That is to say, it's very easy to fall into the trap of teaching a class on religious beliefs as if they were objects capable of being described, when in fact religious beliefs are nothing of the sort. Description may be possible; but objects they are not.

Certainly religious texts and religious practices admit objective description, but religious beliefs, attested to by texts and practices, are more along the lines of a perspective on objects than they are objects themselves. Not being objects themselves, it is incumbent upon outsiders to actually do the hard work of trying to understand what another perspective on objects might say about the objects of our common human experience.

In other words, you have to admit a very human limitation. If you have gotten as far as admitting the very human reality of the difference between a perspective on objects and the objects themselves, you might as well go the rest of the way and admit that the truth of a religious belief falls far short of being something manifestly obvious to everyone possessed of sound mind and body. Nothing, no truth, will ever be self-evident to every person. No matter how hard you try, you will never close the distance between what you think about things and the things themselves. The distance is an infinite one; there will always remain a gap. Only a being like God ever could close it; and last I checked, none of us is able to fill those shoes.

Once this very human limitation is admitted, we can start over by turning the world inside out. The subjective definition of religion, or religious belief, regards, not a discrete set of objects, but a perspective on absolutely everything in existence.

The easiest place to begin an insider's account of religious belief is simply with the human inclination to think about things, anything, like a computer monitor in front of you, or a desk, or the chair you are sitting on. Now extend the range of your conscious awareness and 'abstract' yourself from your immediate surroundings. These things are most likely contained within a room, which has walls and a door. The room is most likely found in a building containing a good number of other rooms. The building will sit alongside other buildings, within the reasonably well-defined limits of a town or city. Mentally zoom out until you are holding the rather vaguely defined contours of our planet in mind. It's entirely possible to go further, of course, and the process can be repeated wherever you find yourself, though my point should be made.

The point is this: tied into the difference between thinking about things and the things themselves, there is a body always situated here and now, and a mental capacity expanded to embrace an ever wider vista. Between the objective and the subjective, the external and internal forms of perception, there is a radical disjunction. Yet the two are stuck together; they fit into each other, inform each other, and so on. The nearer the scale of your consideration to a bodily human scale of existence, the better defined the mental image becomes. The human scale itself is directly apprehended through the bodily senses, especially of hearing, sight, and touch. The further from a human scale, the blurrier the mental image becomes. What the two sides never do is dissolve into each other. The difference between them is one you will carry around for the rest of your life. In a certain sense, you are the difference: the relation between an 'inner' soul and 'outer' body, relating to itself.

I won't labour the distinction longer than necessary. There's nothing necessary about it--nothing logically compelling. Everyone can distinguish in their own mind between thinking about things and the things themselves, but no one must do so.

If you do think through the distinction, however, the duality of the embodied human being is waiting, even if only indirectly, for you to discover almost everywhere you go among the ancient religions. The human being shares the world with innumerable other beings; but the human being, as the philosophers, prophets, and sages of old recognized, also knows that it shares a world with innumerable other beings.

And whatever name is given--call it God, Brahman, Nirvana, Heaven, the Tao, etc.--to that something that explains why there is a world full of things in the first place, it is more like the mysterious depths of conscious thought than it is like the perceptible things human beings think about.

The Tao that can be spoken is not the eternal Tao
The name that can be named is not the eternal name
The nameless is the origin of Heaven and Earth
The named is the mother of myriad things (Tao Te Ching, ch. 1)
There was neither non-existence nor existence then; there was neither the realm of space nor the sky which is beyond. What stirred? Where? In whose protection? Was there water, bottomlessly deep? (Rg Veda 10.129.1)
“For my thoughts are not your thoughts,
 neither are your ways my ways,”
declares the Lord.
“As the heavens are higher than the earth,
so are my ways higher than your ways
and my thoughts than your thoughts.” (Isaiah 55:8-9)

Tuesday, February 26, 2013

What's Wrong with Proceeding to Infinity?

What's so wrong with the idea of proceeding to infinity? A few days ago, I wrote about St. Thomas Aquinas' argument for the existence of God from the motion of objects. My focus was on the changed ways in which the argument is understood. We think we know what Aquinas is talking about when he talks about motion, but in our everyday way of thinking about the world we tend to understand things in Newtonian terms. The most obvious example is how we automatically conceptualize motion in terms of distance/time, e.g. km/h or mph. Aquinas' much broader understanding of motion, however, merely requires a change of state (growth, decomposition, etc.), and is not limited solely to a change of position in space over a period of time.

I avoided saying much about the key part of the argument, which was the insistence that causal sequences cannot proceed to infinity. To refresh our memories, the argument goes like this:
1) Since everything in motion is moved by some other thing, and
2) since movers and things moved cannot proceed to infinity,
3) therefore, we must posit the existence of an Unmoved Mover, which we call God.
The entire argument rests on Step 2), as the first step is relatively uncontroversial. Things don't pop in and out of existence. Things don't move for no reason whatsoever. Everyone should be on board so far. But Step 2) is really troublesome. What exactly is Aquinas claiming?

There are two immediate possibilities. The first says: an infinite regression is not a plausible explanation for anything, in the same way that saying the world rests on the back of an elephant, which stands on the back of a turtle, and thereafter it's turtles all the way down, is not a good explanation for what keeps the world fixed in place. The second says: an actual infinite something (mass, volume, causal series, etc.), whatever it might be and however it might be measured, cannot exist. Our universe is a case in point: for all intents and purposes practically infinite from our limited human vantage, it nonetheless shows physical evidence of having finite limits, in particular a temporal beginning that set everything in motion. (For good measure, let's give the beginning a name and call it the Big Bang, after the well-known comedy show The Big Bang Theory, which made geekdom and nerdom cool.)

Deciding which is the better interpretive option is not easy. When Aquinas says, 'movers and things moved cannot proceed to infinity', it appears to be an instance of the second possibility. The plain meaning of the phrase points to an actual infinity. However, as soon as you ask the question, why 'movers and things moved cannot proceed to infinity', you must go looking for reasons. The sort of reasons that lend themselves to answering the question are more likely instances of the first possibility. An infinite causal series cannot exist because an infinite regression is not a plausible explanation for anything.

Now, a possible argument against the existence of an actual infinite causal series of movers and things moved, offered by the 6th-century Christian Neoplatonist critic of Aristotle, John Philoponus, goes as follows. If the causal series is infinite, then there has been an infinite number of movers and things moved up to the present moment, and there will be a second infinite number of movers and things moved after the present moment. That means there will exist two infinities, which is contradictory, and so also false, because there can only ever be one infinity.

It's unlikely, however, that Aquinas was so numerically inclined. Numbers had been the obsession of Platonists (1+2+3=6), not Aristotelians like Aquinas, who were more interested in thinking through the natural forms of causal relations (A follows B follows C, which means A follows C). It's only with Isaac Newton that these two modes of explanation get brought together into a single system, The Mathematical Principles of Natural Philosophy (1687), rendering the systematic relations between quantities like velocity (m/s), acceleration (m/s²), and force (kg·m/s²) intelligible.

What then might Aquinas be on about when he says 'movers and things moved cannot proceed to infinity'? Aquinas assumes that the human mind is attuned to the natural order of things, such that if it is mentally true, it is also true of extra-mental things, and vice versa. Via their intellect, human beings participate, in some way, in the intelligence which calls the world into existence.

Now it is true that the human mind cannot process an infinite causal series. If you don't believe me, go ahead and try it for yourself. Chances are you mentally charted out an A followed by a B, B followed by C, C followed by D, and D followed by...the infinite series itself. Note carefully that your attention shifted from individual steps in an infinite causal series to the causal series itself, because it wasn't possible for you to chart out every single step in the series. Because if, in fact, you did try, you are still trying, and will still be trying, and never succeeding to comprehend an infinite series in thought, until the day you die. Might as well stop while you're ahead.

So if it's true, as Aquinas holds, the human intellect participates in the intelligence that calls the world into existence, and if it's true that what applies for one also applies for the other, it will also be true that 'movers and things moved cannot proceed to infinity'.

Sunday, February 24, 2013

Historical Inaccuracy

Ever since the 17th century, European scholars have been fighting over the nature of historical accuracy. One of the first battle lines in the erudite debate was drawn across the pages of Thucydides' History of the Peloponnesian War, the classic account of a 27-year-long conflict between Athens and Sparta in the 5th century B.C.E.

To this day, Thucydides is held up as a model for careful historians, unlike his elder contemporary Herodotus, who had a flair for passing on the oddest cultural episodes as fact. But Thucydides' own work did not completely escape criticism. In the 17th century, the lengthy speeches, written with obvious rhetorical concern and supposedly delivered by famous Athenian orators, became the subject of controversy.

Someone pointed out how implausible it was to assume the speeches could be word-for-word transcriptions of actual speeches. It was unlikely Thucydides could have had more than second- or third-hand accounts, if indeed the speeches occurred at all. And their final form was far too polished.

Scholars made a concerted effort to think beyond the surface of the page. If the funeral oration Thucydides placed in Pericles' mouth, which praised Athenian democracy and those who died in its defense, for example, was only a simulacrum of what Pericles should have said, in what sense is it true? Indeed: is there any sense in which a manufactured speech can be true?

Until the 17th century, scholars could more or less assume that where there was an ancient text of proven authority, there also could be found truth. Lorenzo Valla's exposure of the spurious origins of The Donation of Constantine (1440), of course, may be thought to supply an exception to the rule. But even he did not challenge the basic equation of a text's antiquity with its veracity.

When scholars began to ask, however, whether ancient historians could have witnessed the things they claimed to record, a wedge was inserted between the truth of a text and the text itself. Or, you might say, instead of one, there were two sorts of truth: historical truths, or facts of the matter, could now be distinguished from rhetorical truths, their value to readers. Though Pericles might not himself have given the speech Thucydides placed in his mouth, for example, his speech still had value because of what it said about democracy.

This double theory of truth, distinguishing historical fact from rhetorical value, today follows us wherever we go. Even if you aren't quite able to put your finger on it, you can sense its duplicity. It's why you roll your eyes when news anchors put on their serious faces to discuss whether historical inaccuracies in such Oscar-nominated films as Argo or Lincoln detract from the film itself.

The historical accuracy of feature films doesn't matter at all, though it's useful to be reminded of the epistemological burden we bear. In intellectual discussion, questions about historical accuracy easily deflect from questions about the value of something, while our particular attachments and beliefs can sometimes prompt the willful misrepresentation of historical information.

The two sorts of judgments bleed into each other, often in arbitrary ways. The grim reality of the Terror in the aftermath of the French Revolution is taken by many as proof positive of the depravity of the ideas the Revolution represents. The ideas may very well be depraved; I have no reason to quibble over that judgment. Then again they may not. It's just that historical claims can never be the logically necessary basis for refuting ideas--absolutely never, never in a million years. The way things were, the way they are, or even the way things will be, will not square with the way things ought to be. The philosopher and historian David Hume made the point emphatically: an ought can never be extracted from an is.

Welcome to the disenchanted universe. Historical accuracy won't get you closer to the TRUTH, because there are (at least) two sorts of truth for which to account. Historical inaccuracy won't take you further away from the TRUTH, again, because there are (at least) two. Please make yourself at home. You've been here long enough that squatters' rights apply.

Saturday, February 23, 2013

The Trouble with the Common Good

The trouble with talk about the common good is that no one can quite agree on what goods we share in common. Two possible routes out of the dilemma present themselves: either abandon talk of the Common Good and search for a set of lowest-common-denominator goods; or start telling people what the common good is and why they need to listen. The former option we might call libertarianism, which holds that individual freedoms ought to be cultivated, not curtailed, by government. And the latter option can be called paternalism, which holds that government can and should use its authority to guide citizens who don't otherwise know any better.

So if we can't agree, we either find ways around having to agree, or I tell you what you need to know. Right? Not quite. These are obvious logical alternatives. If we agree that we don't agree on the goods we share in common, these are the two possible alternatives, so long as our aim remains to articulate a common vision for how we negotiate our goods. The idea of the common has merely been displaced to another 'level' of analysis. The gnostic idol of conceptual elegance, of pristine logical form, prevents us from actually interrogating what we mean by common--whether that be common goods or a common vision for how we negotiate our goods.

Recall the saying about how we are liberals (or communists) in our youth and conservatives (or capitalists) in our old age. The same can also be said of libertarians (especially in our teenage years and early 20s) and paternalists (in our house-holding years and old age). The editor of Comment Magazine, James K.A. Smith, appears to have undergone a transformation of this sort ("You'll Thank Me Later": Paternalism and the Common Good).

Smith points out that paternalism is the new bigotry in our 'age of authenticity'. Though he might have agreed with the platitude in the past, he now wants to make an intellectual investment in the language of the common good and sees value in teleological language that explicitly identifies that good.
But I’m getting over it. Quite simply, I don’t think you can sign up for pursuing “the common good” and hope to avoid at least some implicit commitment to paternalism—some sense that one knows what is good for others.
From a certain perspective, Smith's mind appears to have grown, if not exactly lazy, then impatient and tired of waiting; though, from another perspective, his thinking has matured. Which it is remains to be seen.

Significantly, Smith never once interrogates the idea of common. He uses a lot of impressive language about teleological orientations and substantive conceptions. His analysis, however, never gets beyond old hang-ups with the word paternalism.

Failing to interrogate the word common, Smith draws a number of conclusions that fall flat. 'Quite simply, people make a lot of mistakes because they don’t think about their own well-being.' That's true, but obvious. 'So people’s well-being is actually benefited by “nudges” that direct us toward decisions that are for our own good.' That's also obvious. Democratically elected governments are engaged in this sort of directive behaviour all the time. So what?

Smith concludes, 'I just wonder if we have more responsibility to our neighbours—a responsibility to pursue policy that nudges them toward the Good they may not know.' And I have to wonder what is being suggested. Who is this 'we' of which he speaks? A church community reaching out into the wider community in which it is situated? Okay. Not sure a presumptive paternalism will win you converts. A civil community? Okay. Not sure paternalism will win you much respect for your efforts. 'We' in what capacity? 'We' through what means?

The commons aren't common if you enter carrying a stick with a carrot dangling from its end. So far as the person dangling the carrot is concerned, there's nothing common about the commons, because they have a priori claimed to be the exception. If you have to talk down to others, you can no longer talk with them. And sticks with carrots very easily become weapons.

Smith cites Aristotle and Aquinas to invest his discourse on why the common good must be dictated with respectability. I suppose, if he has to cite someone, they are better than a number of other names that come to mind. As much as he might want to, though, he cannot cite the paradoxical paternalism of the Prophet Isaiah, who describes God as a being whose ways are higher than human ways, whose thoughts are higher than human thoughts (Isaiah 55:9), but who nevertheless says, 'Come, let us reason together' (Isaiah 1:18). Nor can he cite the paradoxical paternalism of the Gospel writers, of a God above who speaks to humanity as a man among his fellow human beings, and who shares in the common fate of all: death and taxes. And he cannot say with the Apostle Paul, 'I do not understand what I do. For what I want to do I do not do, but what I hate I do' (Romans 7:15), because he knows what to do, and that's all there is to it.

At least, it is if you do not think very hard about what is meant by common.

Friday, February 22, 2013

Justin Trudeau, Unplugged

In the back of the minds of many citizens of democracies, I suspect, there is the ancient memory of a democratic Eden. It's not an actual memory, but an archetypal location from a time out of mind in which governments were truly participatory. Citizens will have slightly different ideas of what a truly participatory government looks like, based on their own preferred level of political engagement. The important point is that everyone carries around in their head an idea of how things are supposed to be. A lens through which contemporary political happenings get focused, it's also the place where we want to be, and maybe even the place we are trying to get back to.

I had a reminder that I too have one of these ideal democratic polities floating around in my own head a couple of nights ago. Sabrina and I headed to the 'West Island', to a church near the end of Sherbrooke Street. We went to see the Liberal leadership hopeful, Justin Trudeau, speak to a gathering of the Liberal faithful.

Now I claim no expertise in things political, nor any overpowering desire to throw myself into the political fray. At the very least, though, I am interested in observing, and will consent to participate in some small way. Still, most of what goes on around campaign events I find unnerving. Everything is scripted down to the last five-second block of time. The candidate's smile is well rehearsed. Not a hair is out of place. Everyone cheers like they are at a rock concert or a soccer game. Stuff seems to happen without too much explicit interference by handlers. People know their roles so well that they must have checked their personal idiosyncrasies at the door. They lose themselves in the mass consciousness of the crowd, which is directed towards the adulation of the candidate, who is to the crowd like a soul animating a body.

The candidate who is on a first-name basis with everyone did not walk into the hall. The wind swept him in. Striding boldly down the center aisle towards the front, Justin stopped to shake a hand. Turning to place a hand on a shoulder, he smiled for a picture, before ascending the stage. The entrance was so impressive, I have to believe it was rehearsed. Just how easily Justin took control of the room took me a bit by surprise. I noted my own emotional response to his entrance was favourable.

Then, after a few introductions, came the homily. It seems somewhere along the way we forgot what democracy was about. The Liberal Party paid dearly for its sins in the polls over the last two decades. Now the Conservatives and the NDP run Her Majesty's government and loyal opposition like military drill sergeants. Policy is dictated to the nation; the idea of a government as the servant of the people has been forgotten. We the Liberal Party membership, Justin said, will be the basis of a revived Liberal Party. The effort must be a grassroots effort, with strong and vibrant riding associations. Candidates won't be chosen by the Party leadership; they will be chosen by the local membership. The entire homily lasted, oh, probably 20 minutes. A warm round of applause was its conclusion.

Great speech. Smiles all around. That distant memory of a true participatory democracy, which isn't strictly a memory, came bubbling up to the surface.

My distant memory is of a place with a responsive democratic government. Politicians who listen to their constituents, speak deferentially, confess the troubles of leadership; these sorts of people make their home there. They talk on behalf of their constituency only if their office allows them to do so, and only in the capacity of their office. Not like American politicians and pundits, who can't resist speaking on behalf of a silent citizenry. Nor like the puffy Senator Mike Duffy, who informs me he knows Canadians know he has never abused the privileges of the Upper House, which Canadians could not know with anything more than the naive trust placed in a slippery salesman. My distant memory looked a little like the vision for a new political order Justin spoke about; it stood in stark contrast to the carefully managed ethos of a political campaign.

People are paid good money to manipulate audiences. Demographic studies are the new way of sounding out the mood of the populace. Polling agencies aid in the development of talking points with specific targets in mind. If Justin does his job right, and if he is to have any chance of success, he will have to make use of all the latest marketing and social media strategies. Politicians no longer shout at crowds without first having tapped every possible data source that can tell them what the crowd wants to hear.

This delicately difficult act of balancing form with content, structure with personality, is native to the exercise of political authority. The democratic character of our polity entails personal representation. But the size of our polity makes self-representation impossible, not merely impractical, but impossible. There will be an unavoidable incongruity between the form and content of political authority. There will also be ample opportunity for persons in authority to err on one side or the other. Perhaps what bothers me about the campaign circuit is that the candidate is entirely personable in an entirely structured way.

Sabrina and I were among the last to have our picture taken with Justin. I made a point of apologizing for asking him to smile for a camera one more time. He did not look tired, though I can well imagine he felt it.

Wednesday, February 20, 2013

Talking about Talking about World Peace

'Could a new alphabet promote world peace?' On the surface, albeit implied in a rhetorical question, it seems like a plausible suggestion. Of course it could. But the real question is, Will it promote world peace? And the answer to that is obvious: No, it will not.

It all boils down to a question of agency. Languages don't promote anything. The English language encourages a sort of forgetfulness on this point, which we have even seen fit to give a pair of technical names: metonymy and synecdoche. The former identifies something by the name of something closely associated with it; the latter identifies a whole by one of its parts (or a part by the whole). Rhetorical devices are no excuse, however, for intellectual laziness.

If you ask the question, Could a new phonetic alphabet promote world peace? what you are actually asking about is whether people will be able to use this new alphabet to promote world peace. Alphabets have no existence apart from language users. Alphabets have no agency apart from language users, and in the strict sense of the term, they have no agency at all. The part only functions as part of a whole. So the answer to the question is: No, absolutely not. Only language users, living breathing human beings, can promote world peace.

My criticism may appear rather paltry, even inconsequentially nit-picky. I believe it is an important point to make, though. Contemporary post-structuralist theorists tend to think of language as determinative of how people understand their world. Change the language, this line of thinking goes, and you change how a person understands their world.

Certainly language shapes how we understand our world; but it's not a determinative one-way street beginning with language and ending with how people think. Language users like ourselves shape language, just as much as our thought patterns are shaped by language. The relation between language and language user is one of mutual dependence. No part without a whole; nor any whole without parts.

Like the 20th century attempt to create a universal language named Esperanto, these new attempts will get us no closer to the goal of world peace. They may, in fact, take us further away. If we suppose a common language will ameliorate violent behavior, like bread can ameliorate hunger, we will have misunderstood the nature of violence.

For the nature of violence also boils down to a question of agency, mine and yours, ours and theirs. People may communicate with and understand each other perfectly. That doesn't stop them from also wishing the most unspeakable horrors would descend upon the other person.
And world peace? "I can certainly see the argument for saying that a shared language can prevent conflict. However, shared language can dupe us into thinking we share other things - values, beliefs, goals - when in many cases we don't. Does it minimize differences, or merely mask them?

"In any case I'm not sure that ease of communication guarantees harmonious communication."
Though it may seem rather immodest of me to say so, I am quite sure it does not.

Tuesday, February 19, 2013

The Canadian Office of Religious Freedom

Nothing raises Canadian political ire more than a politician's reference to religion, any religion, but especially religion in politics. Not that everyone is going to condemn every reference. Certainly not. But you can be sure that every reference will be polarizing. Some will approve, and some won't. Those who don't approve will cite precedents for keeping religion out of politics. Those who do approve this time around will be citing the same precedents next time around.

Political conservatives and liberals both have their own way of doing religion, which they see as unproblematic. Even vocal atheists, who tend to be politically liberal, do religion. Some references will slip by as being harmless and not worthy of response. Other references almost seem to require a sarcastic response. Everyone is reaching for a moral high ground; but in the case of the violently derisive discussion about religion, it is unlikely anyone will ever reach it.

So naturally Twitter experienced a minor eruption as Prime Minister Stephen Harper announced the name of an ambassador to head Canada's new Office of Religious Freedom. Commentary came mainly from the liberal end of the political spectrum. Comments were incisive and pithy. With only 140 characters to spend, they had to be. Here are a few of the choicer contributions:
CC ‏@canadiancynic

In a shocking turn of events, the new head of the Office of Religious Freedom is a Christian, …
Herbert Pimlott ‏@Herbert_Pimlott

Proof #Harper has a sense of humour?! Setup Office 4 Religious Freedom & gag scientists, cut dissenting NGOs #cdnpoli
collin grasley ‏@mode23

Theological Action Plan - harper opens office of c̶h̶r̶i̶s̶t̶i̶a̶n̶  religious freedom. #cdnpoli #FIVEMILLIONTAXDOLLARS
Secular types wonder why Harper has cut government funding everywhere else. Federal Liberals and NDP wonder, in a very predictable turn of events, why the new Ambassador is Christian. Atheists wonder why there is not an Office of Freedom From Religion.

As a religious studies student, and a Christian with an interest in the study of the so-called world religions and natural religion, it immediately strikes me that the cast of the objections is predictably Christian. Granted, the objectors may not individually profess the Christian faith. They may even despise everything about Christians and what they believe. But as long as they appeal to a manifest obviousness that actual religious confession has no place in the public square, they whole-heartedly identify with a basic Christian dogma about the separation of the powers spiritual and temporal (or civil). Render unto Caesar, etc. etc. The joke is on them.

As an idea, at least, an Office of Religious Freedom actually makes a fair amount of sense in a post-Soviet, post-9/11, post-Arab Spring, post-, post- world. The End of History came, pace Francis Fukuyama, but the Last Man turned out to be some variety of cleric: an imam, pastor, or priest. Critics of the Office, if they still have it in their heads that a secular utopia is just around the corner, risk exposing themselves for the cultural dinosaurs they must inevitably be.

Critics of the choice of a young Roman Catholic prof for the position of Ambassador risk exposing their ignorance about the difficulty of staffing the Office. The critics themselves demonstrate that secular types won't touch the position with a 50-foot pole, which means you are going to have to look for a committed believer, which means...some particular faith. We are told that two other people turned the position down. Which means Dr. Andrew Bennett was not the first choice for what will inevitably be a controversial and thankless job. Whether he is a bad choice, however, will not be determined until we have gone a little ways down this road.

The latest Twitter post:
Iain Harnish ‏@IainHarnish

Office of Religious Freedom How did a once forward-looking country come to this.
Instead of this sort of armchair quarterbacking, it's better to accept the Office of Religious Freedom as a fait accompli. Our energies, if we care at all, should be directed towards what sort of issues exactly the office is going to champion at home and abroad. The one danger I see is that the Office very quickly becomes a lobby group for the protection of Christian missionaries abroad. If so, it will champion a very narrow conception of religious freedom, one that comes dangerously close to equating religious freedom with the Christian freedom to proselytize. (Other religions proselytize; but only Christianity makes proselytization part of its raison d'etre.) That will undermine perceptions of the Office in Canada at large. It will also undermine, I would argue, genuinely Christian interests, which in the public sphere ought to be directed towards the dignity of humanity. Beyond that, the powers of the state are a honey-trap, which ought to be held at arm's length.

And rather than mock the new Ambassador of Religious Freedom, I think it more appropriate to extend the poor bastard my sympathies. We are watching you. And by we, I mean all of us Canadians. And we don't agree on very much at all.

A Warrant for the Arrest of a Pope?

An old story is being brought forward as the real reason why Pope Benedict XVI has resigned the papacy. Namely: he is looking for immunity against prosecution for 'crimes against humanity' for his role in covering up rampant sexual abuse of children perpetrated by Catholic clergy. None of this makes much sense, least of all immunity as a reason for abdicating. If immunity were the issue, not abdicating would be the likely solution.

Richard Dawkins tweeted the story sometime last night: 'The Pope seeks immunity: The end of the Vatican could be near'. The title doesn't actually reflect the content. From what, or whom, is the pope seeking immunity? The article is not clear. It focuses on a single word ('defenseless'), interpreting it as broadly as possible. Defenseless against what? Prying eyes? Over-zealous prosecutors? Well, no one in particular seems to be a better answer than the one suggested. A man of Benedict XVI's stature, in a position that is an inevitable lightning rod for controversy, will need security. The measure is entirely precautionary.

Reuters indicated last Friday the ICC has declined to pursue charges against the pope ('Pope will have security, immunity by remaining in the Vatican'). Again the title of the article suggests more than the article offers. The steps taken on the pope's behalf can, and no doubt should, be interpreted as nothing more than steps to ensure that an elderly statesman is allowed to retire in peace.

Dawkins and a few conspiratorially-minded friends (for example, here and here), nevertheless, persist in their conviction that a former pope's needing security at all is evidence of an admission of guilt. The only new feature to the story, however, is a sensational 'news exclusive' put forward by The International Tribunal into the Crimes of Church and State. Never heard of it? Neither had I. It appears to be the work of one man, Kevin Annett--which means it doesn't (yet) qualify as an international tribunal.

There is, of course, a much longer story that can be told. The Catholic Church is under criticism and pressure from the wider world to bring its internal policies regarding criminal behavior into line with criminal codes in the various jurisdictions within which it resides. Ideally, the Church would report any offender to the public authorities. So if a priest was found abusing a child, that priest would be hustled off to a court appearance, rather than another diocese.

During the Late Medieval Period, very similar tensions arose between the Church and the governments native to the places where it maintained residence. The Church was the nearest thing to an international body at the time. It maintained its own government and court system. Legal precedence and continuing influence allowed it to keep public lawmakers at bay. So if a priest disrupted the peace in any way, the Church claimed jurisdiction in the matter. The priest usually got off with a light sentence, and was hustled off to some other corner of Germany or Italy or France.

Late Medieval political theologians like Marsilius of Padua worked to regularize questions of legal jurisdiction. Rather than having two communities interacting with each other in their day-to-day affairs, yet subjecting themselves to separate legal regimes and systems of government, Marsilius argued that lines between the two had to be drawn in terms of the ends appropriate to government. Church government was government over the souls of people. Public government was government over the bodies of people, i.e. their mortal affairs like property, commerce, judgment of criminal behavior, and so on.

Should a priest commit a public crime, Marsilius thought the priest should also stand in public court. No special preference for judgment in ecclesiastical court ought to be accorded the clergy. That makes a good deal of sense, Marsilius believed, and accords well with Christ's command to 'Render unto Caesar what is Caesar's, and to God what is God's.'

The problem is that Marsilius' writings presage the eventual division of Christendom into Catholic and Protestant parts, which in turn presages the marginalization of the Christian faith in the public life of nations. Marsilius' writings resurface, in fact, just as Henry VIII prepares to nationalize English monasteries and declare himself head of an English Church. The Tudors, it turns out, had PR savvy.

My best guess is that the institutional reformation of the Catholic Church, opening up the ranks of the clergy to public scrutiny, will take a good long time. A meritocratic, aristocratic form of government, like the Catholic Church's, is also a very conservative form of government. Response to the contingencies of an institutional crisis must be negotiated in the context of authority structures that privilege commitment to the established order. Critics of the Catholic Church forget what sort of continuing influence a long communal memory can have on a present state of affairs.

Nor is this necessarily a bad thing; though priorities must be clearly kept in mind. When Caesar has to point out sin in ecclesiastical ranks, there is a problem.

Monday, February 18, 2013

Idealism; Or Thinking the Thoughts of God

Idealism is the name given to the thought that every thing is a thought. It's a counter-intuitive way of thinking about things. Usually we assume that the things themselves are one thing and our thoughts about things another. You can think about a tree, for example, but you assume the tree itself is not in your head. Which makes a certain amount of sense, right?

This is the average everyday way of thinking about things--as if they aren't in your head. It's also the way that someone like Aristotle thought about things. The ancient Greek philosopher thought that things were composed of form and matter. When you thought about a thing, you thought about its form. The matter of a thing, its body, was the stuff that didn't get into your head. The dual form-matter composition of things meant that things could be both inside and outside your head, at the same time. An idealist, however, wants to point out that the thought a person has about a tree and the tree itself are actually the same thing: a thought about a tree.

Now how does that follow? Notice how when you distinguished between the thought about a tree and the tree itself, it was you who distinguished between the two. You never stopped thinking. You thought about the thought about a tree, and you thought about the tree itself. And lo, you discover the tree itself was actually your thought about a tree. (Try not to think about this too hard. Just accept it as true. It'll save you a lot of headache and pain.)

The idealist seems to suggest that the tree is nothing more than a thought in your head. That seems to be the consequence. Well, yes and no. Yes, on the basis of the point made above, that the thought a person has about a tree and the tree itself are actually the same thing: a thought about a tree. But no, because even the idealist recognizes that the reason we feel compelled to distinguish between thoughts about things and the things themselves must be given some explanation. 

Part of the problem, the idealist thinks, has to do with what is meant by 'your head'. While different idealists employ different sets of terms, idealists generally tend to identify two sorts of consciousness, one finite and the other infinite. Finite consciousness, the one that draws distinctions between thoughts and things, is 'the head' in which people feel most comfortable. Infinite consciousness, on the other hand, embraces all finite consciousnesses--yours, mine, his, hers, ours, theirs, etc.

At the very base, the idealist insists, every finite consciousness is merely a manifestation of infinite consciousness, in which thoughts about things and the things themselves are both the same thing: thoughts about things. On what basis idealists insist on this is not always clear, but let's overlook that for a moment. Finite consciousnesses, like our own, may understand that thoughts and things ought to be the same thing: thoughts about things. Only an infinite consciousness, however, actually understands what it is for thoughts about things and the things themselves to be the same thing.

And that is what the idealist means when they say they want to think God's thoughts after him.

Sunday, February 17, 2013

The Historical Adam

Alvin Plantinga makes my intellectual hairs stand on end. For a number of years now, he has participated in the so-called 'religion and science' debates alongside atheist counterparts like Richard Dawkins and the late Stephen Jay Gould. A well-established, well-respected Christian philosopher, Plantinga offers the intelligent believer's perspective. And yet, every time he opens his mouth or puts his pen to paper, I have to wonder whether he has actually listened to his conversation partners.

Plantinga has contributed a short piece to a Think Christian series on the historical Adam, making four basic points:
1) Science is not infallible; the "current" theories are always changing.
2) Serious Christians, going all the way back to Origen, have doubted whether the first chapters of Genesis are to be read literally.
3) What a Christian is allowed to believe regarding the first chapters of Genesis must be constrained by the sorts of effects particular readings will have on other doctrines. If there's no historical Adam, for example, does that prevent you from affirming a historical fall? And if there is no historical fall, does that prevent you from affirming the necessity of Christ's incarnation and atonement?
4) And finally, it's possible that, somewhere between 160,000 and 200,000 years ago, God selected a pair of proto-humans and imbued them with the characteristics essential to being 'created in the image of God'. Shortly thereafter, they fell into sin. All of their offspring, namely modern humans, can thus be said to be created in the image of God, and fallen in sin.
What I find so frustrating about Plantinga's suggestions is that what he manages to do is insult the intelligence of evolutionary biologists by pandering to their way of thinking about things. The one thing he will not say is that scientific study will not yield evidence of a single, original human pair--which it will not, not in a million years. Evolution takes place across populations, not in individuals.

To claim, as Plantinga does, that God stepped in and added 'his image' to a creature's make-up at a point late in its evolutionary development is an empirical, scientific claim. Or, at least in principle, it should be. If God has done this thing, the fossil record ought to reveal something. Then again, maybe the fossil record does reveal something. Maybe human intelligence, that 'thing' that is supposed to set us apart from the rest of God's creatures, is that ingredient that God adds. Okay. Richard Dawkins preemptively lambasted the fourth claim years ago, pointing out that empirical claims are subject to verification and falsification. But God's actions (not that Dawkins believes in a God who acts in human history) are not verifiable, which means Plantinga engages in idle speculation with no scientific currency.

For reasons I stated a few days ago with reference to ancient narratives of a Great Flood, I am inclined to agree. Plantinga wants to paint a scientifically plausible picture, one which irons out the creases and fills in the gaps between two very different sorts of evidence. The direction of his argument is towards fitting the biblical narrative into a natural historical narrative of the history of life on this planet. The direction of his argument, however, should be towards clarifying the different sorts of claims being made.

One thing we do know with a fair degree of certainty is that the creation narratives were never written with the evolutionary biologist in mind. That's a good place to start. Another thing we know with a comparable degree of certainty is that an ancient creation narrative describing a set of moral relations between plants, animals, human beings, and a whole range of other natural objects is not likely to be saying the same things as texts making sense of the evolutionary history of life on the planet, which rely on evidence observed in the fossil record or encoded in genetic material. That's a much better place to start, since it underscores the basic difference between these two modes of inquiry.

Thursday, February 14, 2013

Comparative Religion

Many contemporary scholars would balk at the suggestion that a comparative study of religions is even possible. The suggestion rests on the assumption that there exists a common thread working its way through the world's great (or not so great, depending on your perspective) religious traditions. That assumption holds the thread can be drawn out of all religious traditions, including Buddhism, Confucianism, Hinduism, and also Judaism, Christianity, and Islam, without doing significant violence to the tapestry of any one tradition. A tall order, to be sure. Maybe even impossible to complete, since each religious tradition is itself host to a wide variety of expressions within its boundaries, a variety that has sometimes even been generated through an encounter with another great religious tradition.

But is it, in principle, impossible? In other words, are the great religious traditions essentially incomparable, even if superficial points of comparison might appear? Note carefully what is being asked. Were they essentially incomparable, anyone who got stuck in a way of thinking that they were comparable would be judged to fall somewhere between intellectually naive and morally reprehensible. (Damn colonizer! colonizing the world with your intellectual discourses...) The stakes are naturally quite high.

Those familiar with Judaism, Christianity, and Islam, for example, might be tempted to make the following observations. Buddhism has no supreme God. Does it count as a religion? Hinduism has so many gods it has been confused with Greco-Roman polytheism. Does it count as a full-fledged religious tradition? Confucianism looks more like a social ethic, holding to a vaguely Socratic sentiment about why one would want to know about the gods and the afterlife if one does not know one's own self. Does it count as a religion? This line of questioning loads the dice in favour of certain traditions over against others. Those traditions that profess belief in a single, exclusive, supreme deity are judged superior to those traditions that do not. This is a sure way to be accused of intellectual mischief, if not by your peers, then by your successors. Not a few Western, and usually Christian, scholars have fallen under this condemnation.

Over on the Eastern side, the supposed exclusivity of the Western traditions is particularly problematic. Is divine truth so mundane that it can be picked up and tossed about with ease by mere human beings? Does it reflect a pious disposition towards divinity for a person to seize hold of some "truth", some statement, some intellectual formula, and assert it as divine truth? The divine order presumably doesn't speak a human language; it also remains a question whether it speaks, in an active sense, at all.

The lesson I believe it is important to keep in mind is that the best way to walk through a minefield is with one's eyes open. (Let's set Confucianism aside, since it introduces a rather different set of considerations, to focus on two of the great religious 'families'.)


The Professor of Comparative Religion with whom I have worked now for a few years, Dr. Arvind Sharma, proposes a very general rubric to compare Abrahamic monotheism and Indic monotheism. The latter term he stretches to include Hinduism, Jainism, Sikhism, and also, surprisingly, Buddhism.

The comparison proceeds as follows: Abrahamic monotheisms generally hold that God is distinct from the world of human experience, on account of being its Creator, and that all other gods are false gods. The Ten Commandments ("You shall worship no other god...") and the Shahadah ("There is no god but God...") both bear out the general characterization. Compare this to Indic monotheisms, which tend to hold that God and the world are ultimately One, and that all other gods are ultimately manifestations of that One. Hinduism holds to the ultimate unity of all things ("Brahman is Atman"; the soul is all things), while Buddhism is merely its metaphysical inversion ("Nirvana/Anatman"; no soul is in no thing, or all things are empty).

Here we seem to have a common thread working its way through a number of religious traditions: an admonishment not to let "external", material things (wealth and possessions, the praise of others, authority, etc.) cloud one's judgment. The Abrahamic traditions caution against fashioning gods for oneself, while the Indic traditions peel back the veil covering over the 'relative' worth or 'emptiness' of the same. The messages are roughly equivalent; they concern knowledge of one's 'self'. Should you have in your head that your accomplishments, natural attributes, possessions, or some other type of personal accessory elevate you above your fellow human beings, you are a sad, sorry, sack of...foolishness. Our common end, in bodily death, lays bare all that is pretense and posturing.

I make some observations, by way of clarification, about Dr. Sharma's very general taxonomy of religion. It seems to me that what is being compared are not differing conceptions of God or divinity, but differing conceptions of human nature--whether there is one, what it entails, and so on. What is not being described is an utterly transcendent divine order; but rather the relation between a divine order and humanity, or more specifically, how human beings understand themselves in relation to the divine order. Or, even more specifically, human self-knowledge.

What I mean to suggest is that there is a very general, and essentially comparable, difference between how Abrahamic monotheisms and Indic monotheisms conceptualize human life in time. Abrahamic monotheisms tend to hold that time is finite, with beginning and end, and also that human beings get a single chance to live their life on God's green earth. On the other hand, Indic monotheisms tend to hold that time is an infinite, endless circle, and also that human beings go around and around the karmic wheel of birth, death, and rebirth. In both cases, the human being is the same human being we are all readily acquainted with (being human beings ourselves). But the nature of human life, where it comes from before birth and where it goes after death, is radically different.

Hence the difference, it also seems to me, between an Abrahamic understanding of salvation and an Indic understanding of liberation. If one understands oneself wholly cut off from the divine order, which is implied in conceptualizing God as Creator, it will take divine aid to 'return' to a right relation with God. Whereas if one understands oneself as an immediate participant in a divine order, which is implied in conceptualizing God as ultimately the same as the world, it is pointless to speak of divine aid in the process of achieving liberation. The human being is, always has been, and always will be God.

Tuesday, February 12, 2013

Thomas More's Modest Utopia

Utopian visions have bedeviled modern European society, at least since the publication of Thomas More's Utopia (1516). Without having done too much digging around in reference texts, I feel safe drawing the association because More's novelty was to have coined the word utopia, meaning 'no-place', which is also a pun on a word meaning 'good-place'. More most likely intended that both meanings be taken seriously: as great as utopia might have been, it wasn't meant for anywhere in this world.

The 18th-century Prussian philosopher Immanuel Kant famously stated that ought implies can: if I ought to do something, I should also have the means to see that something done. And vice versa: if I cannot do something, it is not just to insist that I ought to do it. For utopian visionaries, the implication was: if it can be described, it can also be built. The mental image of a utopian existence, in principle at least, ought to be translatable into the bodily reality that you and I share with the rest of humanity.

Utopian schemes plagued the modern era and left human bodies strewn about in their wake. The old order of Judeo-Christian restraint was overthrown by a growing conviction, captured perfectly in Kant's statement, that ought implies can. It is, after all, integral to the Gospel message that there are things human beings ought to do that they can hardly attempt without divine aid and will never completely accomplish in the present life. Most notably: an especially expansive understanding of love for our fellow human beings. However you flesh it out, the teaching of human sinfulness is a real drag on a can-do attitude. Smaller scale utopian projects were harmless enough, so long as they leached off a larger society, to which utopian citizens could retreat after their inevitable failure. On the scale of states, however, utopian projects became the meat grinders of humanity. Nice on paper; terrible in practice. If ought implies can, and I happen to control the government and military, what's to stop me from attempting to impose a utopian order, by whatever means necessary, on a nation in the short-term?

Thomas More's account of the namesake Utopia, against this dismal backdrop, is all the more impressive for its modesty, and for the author's own reticence, in the text, to endorse grand attempts at reforming society from the top down. His better known successors in the genre, Bacon and Campanella, described island states centered on the fruitful, albeit clandestine, pursuit of knowledge to the ends of the known world. A prototype of the modern research institution, with a library, or some record of all humanity's knowledge, stood in the middle of town. All the citizens contributed, in their own ways, to the extension of humanity's utopian trust. A pleasant dream for impoverished academics, no doubt, who otherwise scrape by around the edges of wealth, power, and influence.

More's account of the 'good place', which was also 'no place', attempts to answer one question: under what conditions will the vast majority of people be content? Good government, for starters. (Where haven't we heard that before?) Government had to be concerned with meeting citizens' material needs, their requirements for physical activity, and their mental endeavours as well. This meant that government could not pursue ends unrelated to the citizens' own welfare. No one should get rich off the backs of the poor. Everyone should have a house in the city, the opportunity for a family, the chance to advance in society with age, and an apportioned period of time down on the farm.

Some critics have accused More of promoting a primitive version of communism. There are certain communist features in Utopia, but if the criticism remains there, it misses the brilliance of More's work.

The society More described was a near-perfectly functioning organism. It had provisions for the defense of its borders, but no real need to employ them. It also had barely a hint of law enforcement, and no real need for the exercise of leadership in even the most mundane affairs. Every single citizen seems to have been thinking out of a single mind, with a single purpose. The prince, or magistrate, of the land played the part of a mere figurehead.

More inserted himself in the story of how knowledge of Utopia reached England. His character concludes the work thus: 'I cannot agree to everything...related; however, there are many things in the Commonwealth of Utopia that I rather wish, than hope, to see followed in our governments.' More's utopian vision is not yet about the knowledge and utilization of the natural world, but rather about a well-lived life.

Human life would be especially sweet if we all just agreed with each other. But we don't; nor is the enforcement of agreement, by our common wisdom, a better thing than the tolerance of disagreement. Ought does not yet imply can east of Eden, which means More only wished that things might be so, and did not hope to see them accomplished.

Wednesday, February 06, 2013

Culture Making

I hate the word culture. My skin crawls to hear other people use the word flippantly, as it frequently is used. It exemplifies all that is wrong with the world. Talk about culture is talk of mutually exclusive absolutes: you have your culture, we have ours; we live on our preserve, you live on yours. As a consequence, and this should come as a surprise to no one, talk about culture also happens to be the language of relative truths.

Culture denotes anything that is not rational, questioning, and discriminating; anything that has become habit and slipped beneath a critical gaze. Culture stands in for the things we share, our feeling of solidarity, the way things work around here, or at least the way they ought to. Encultured beings all of us, we turn and reflect on our shared culture only after it has become sedimented into our ways of relating to each other. We wonder what this or that most recent turn of events might mean for our culture; but without the advantage of hindsight, we are, at best, stabbing in the dark. Insofar as we are, though, we are all starting from the same place, travelling the way of all flesh, ending up no better off than when we started. That's the God's-honest truth of it.

Contemporary talk about culture goes terribly wrong by placing itself within the framework of change and empowerment, of getting things done. Those who talk the language of culture-making or culture-creating or culture-shaping talk down to their audience, not with them. They demand obeisance from their audience; they do not encourage participation. It's a new language that masks what Saint Augustine called the libido dominandi, the desire to dominate; or what Nietzsche, in much simpler terms, called the will to power.

Formerly, culture was something the upper class had--fine arts, good wines, impeccable manners--and the boorish lower classes did not. Then we learned about class consciousness from Marx, and it became obvious that everyone had culture. We realized the word was better used without an implied moral judgment about social inferiority. The language of culture-making, however, is something quite different. One only has to consider the material from which culture is manufactured, shaped, moulded. It is not the material of typical cultural artifacts like wrist-watches, computers, famous pieces of artwork, or low-brow examples of cultural kitsch. Not even by a mile. The material of culture-making is nothing other than human beings, to be made and moulded at will and whim.

The idea that North American society finds itself locked in a culture war should be brusquely dismissed. Our bourgeois existence has become so decadent that we must conjure up enemies to populate our imagination. There is no war; no maimed bodies lie strewn about, nor do tortured souls wander desolate streets crying for loved ones lost. Certainly violent crimes and even socially sanctioned forms of violence exist; but these interrupt the social order, rather than disrupt and dislocate it. What there are, instead, are bruised egos and spoiled children, who have been whipped up into a frenzy, but who are still able to go home at night in relative safety.

In the grand scheme of things, a minor readjustment of a community's guiding principles is serious business, no doubt, but that doesn't earn the difficult process by which the adjustment is made the title of war.


Take Peter Stockland's wonderfully written juxtaposition ('This is Ultra-Tolerance') of the moral culture that was, when a mother could march straight up to a pair of amorous strangers and shame them for not abiding by common codes of decency, with the moral culture that is, where neighbours must tolerate S&M role-playing in the front yard. You may agree with everything that Stockland has to say--which I do, almost down to the dot over the last 'i'. But you are still left with a question about how to live with neighbours who don't conform to your moral outlook. The nostalgic note Stockland sounds for the days of his youth doesn't yield a positive direction, and neither does moral condemnation.
Avoid surprise: expect the worst. Expect, in other words, that human beings are simply slaves to animal passion and lack any capacity to consider first their public obligations to you or anyone else. This is ultra-tolerance. This is the true politics of the day.
Is this an example of war? Hardly; unless sniping from the comfort of one's living-room couch counts. It is difficult to discern, in this limited forum, whether Stockland thinks engaging in the politics of the day is of any value. My impression is that he would have us put on white gloves entering the public square and burn them after leaving. I readily grant, though, that my reading may be skewed by the fact that Stockland is obviously writing to an audience he believes like-minded, and is not talking with the role-playing S&M neighbours. The absence of any positive suggestion is nevertheless conspicuous. It is easy to criticize, but much more difficult to propose a course of action without becoming a tyrant.

The choice of title for Stockland's piece is ironically self-enabling, to be sure; ultra-tolerance always goes two ways.

Tuesday, February 05, 2013

Debt Peonage

The latest weekly installment columnist Chris Hedges has posted offers an insightful criticism of social inequities in a world increasingly reliant on borrowed money:
Miss a payment on your credit card and your interest rates jumps to 30 percent. Fail to pay your mortgage and you lose your home. Miss your health insurance payments, which have been spiraling upwards, and if you are seriously ill you go into bankruptcy, as 1 million Americans who get sick do every year. Trash your credit rating and your fragile financial edifice, built on managing debt, collapses. Since most Americans feel, on some level, as Hudson points out, that they are a step or two away from being homeless, they are deeply averse to challenging corporate power. It is not worth the risk. And the corporate state knows it. Absolute power, the philosopher Thomas Hobbes wrote, depends on fear and passivity.
As a Canadian, I am told by government and corporate analysts (usually on CBC's Power and Politics or The Lang and O'Leary Exchange) that the situation north of the border isn't quite so dire. I am inclined, however, to take any such claims with a healthy dose of skepticism. The fine art of economic prognostication only rarely attains the exacting standards of accuracy of a scientific statement, and even then this is only determined in hindsight. Each time I hear the phrase 'Going forward...' or 'Looking forward...' fall from a person's mouth, I make a mental note to myself that this must be another graduate fresh from the Hogwarts School of Business.

I more than sympathize with Hedges' account; in fact, I very much identify with it. A low but persistent level of anxiety clings to my every financial calculation; a holdover, I think, of memories of growing up in a family whose agricultural livelihood was directly exposed to the violent disruptions of the market. Will there be enough money to pay rent next month and cover my other bills? The month after? These are questions I must ask myself every two or three months. My wife, Sabrina, worries about me when I get anxious, but keeps a cool head herself. Our different reactions seem to come from our differing experiences growing up, as she grew up among people who managed the ups and downs of the marketplace, rather than being immediately exposed to them. (A real wrong-side-of-town romance, if there ever was one.)

Divergent perceptions, however, do not cancel out the reality of things. And that reality would be...what, exactly? Hedges makes an able case, I think, for the reality of 'debt peonage'. Despite all the anger and stress created by an economic system designed to extract every last penny, compliance is enforced by the amount of debt persons are forced to take on to gain access to the system in the first place. It appears a vicious circle; and all the more so because there is no real option to opt out.
There are no impediments within the electoral process or the formal structures of power to prevent predatory capitalism. We are all being forced to kneel before the dictates of the marketplace. The human cost, the attendant problems of drug and alcohol abuse, the neglect of children, the early deaths, is justified by the need to make greater and greater profit. And these costs are now being felt across the nation. The phrase “the consent of the governed” has become a cruel joke. We use a language to describe our systems of governance that no longer correspond to reality. The disconnect between illusion and reality makes us one of the most self-deluded populations on the planet.
Corporations are disempowering citizens, which only seems to be confirmed by the fact that corporations, through a legal farce, are now defined as persons. The winner takes all in a game where corporations both deal the cards and sit at the table as an active player.

And that, it seems to me, is the problem with Hedges' otherwise insightful column. He buys into the corporate nonsense about corporations being persons by personifying them in his criticism of corporate America. All of what Hedges has to say about the contemporary economic situation may be true, but none of it suggests there is actually a conscious entity willfully extracting wealth from the downtrodden. I say this even if advertising campaigns are all about separating customers from their money.

The easiest way to demonize something is to personify it. More likely, the employees of corporations, from the CEO on down, are fulfilling the mission of the corporation to provide a better product or service, and make a profit doing it. In their interactions with governing officials, they will no doubt seek certain advantages to assist in the achievement of that aim. But this is still a long way from a conscious, explicit intention to enslave the greater portion of the citizenry.

Once a system is personified, it becomes all the more difficult to reform it. Hedges' talk about debt peonage sounds quite medieval: there is nothing redeemable in it. But one should keep well in mind that debt has been, and still can be, a powerful financial tool, giving access to capital to persons who would otherwise be left outside.

Monday, February 04, 2013

Genealogies were once the province of nobility. A clear record of who married whom, and who fathered or birthed whom, was necessary to maintain a lineage, with all the economic, political, and social entitlements that followed from it. Among the upper echelon of the nobility, the royalty, genealogical research was all the more important to preserve the stability of kingdoms. Where the heir was clear, subjects had reason for cheer.

Today, however, just about anybody can research their family history. In a sure sign of the steady democratization of knowledge, the old concerns of inheritance and precedence are largely absent as motivating factors; sheer curiosity prompts many of us to inquire. Likewise, in a sure sign of the capitalization of knowledge, publicly-traded, for-profit companies like Ancestry sell access to large government and other public databases, and provide an attractive online platform with which to organize the information.

Not all is as straightforward as it seems. On the website's main page, visitors are greeted by a hallmark of advertising campaigns: a hyper-inflation of consumer expectations. The visitor is told: 'Your Family History FREE FOR 14 DAYS: Sign up now'. The link confronts visitors with a repetition of the inflated promise: 'Discover your story...' What the website actually offers paying members is written in a much smaller font size: 'Original images of immigration records, military files, historical newspapers, census records and more are waiting to be explored.'

Does any of this actually count as your history? It's an interesting question on which to reflect. The historian E.P. Thompson* remarked that every historian needs to be aware of the deep divide between personal memory and the information learned from books and the contents of archives, which trails behind humanity at a distance of about a century. If you don't pay close attention to the difference between those things learned by word of mouth from family members and friends and those things learned from an archival resource, the tendency will be to allow the former simply to bleed over into the latter. Personal prejudices are likely to become writ large on a world stage.

At the same time, it's that prejudiced personal aspect that makes the history your history, and not merely your study of the history of other people. Ancestry may be able to show a baptism or immigration record of a grandparent; but that doesn't mean these are yet your history. These records only become so when you add memories shared with living family members into the mix. Why did the grandparent emigrate in the first place? What did they encounter when they arrived? When did you become a remote twinkle in someone's eye? Ancestry runs a commercial on the CBC News Now network that cleverly propels viewers past this all-important difference in the name of technological progress. The scene presented is of an elderly father and a middle-aged son, who wants to take up the father's mantle as family historian. But he wants to do it his way, by which is implied that he wants to take advantage of new technologies. And, not surprisingly, Ancestry is there to sell him its product.

I won't claim that Ancestry might not offer something valuable to some of its customers. On the other hand, I do think it misrepresents itself by claiming to offer more than it can--at this point, at least. It would be interesting to find out, as users build Ancestry's database, whether actual human stories, and not merely sets of records, begin to fill in the gaps, say, between the record of a person's birth, military service, marriage, and death in ways that significantly aid non-relatives.

Questions should also be raised, as they are for many other social networking platforms like Facebook, Twitter, Instagram, and so on, about who has access to the information stored on Ancestry. It would be rather troubling to see the bits and pieces of personal history users are able to assemble held hostage by for-profit commercial interests.

*Correction: Eric Hobsbawm

Sunday, February 03, 2013

The Technology of Reading

It turns out the internet is changing the way we think about things, how we interact with each other, and what we expect of ourselves. Indeed: how could it not? Only a Cartesian was ever able to convince themselves that the materiality of bodily existence had little or nothing to do with how we think. The ancient Greeks and Romans were well aware that a world full of objects, not excluding capricious deities, continually poked and prodded the psyche. Medieval Christians were also aware of the lusts of the flesh, and only too aware of how they could upset the mental balance of the most devout believer.

Modern thinkers, however, have had a certain amount of difficulty shaking their Cartesian hangover. It still shows in how questions about the effect the internet is having on the way we think are asked. Nicholas Carr pointedly asked, 'Is Google Making Us Stupid?', invoking the literary authority of Stanley Kubrick's cinematic rendition of Arthur C. Clarke's 2001: A Space Odyssey to make his point. The lesson of the death (better: dismantling) of HAL 9000 gets at the rapidly disappearing distance between human and machine: 'as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.' Our reading has become nothing more than superficial glossing, and then our attention wanders...which supposedly gives us licence to conclude that something unprecedented in the history of literary consumption is occurring?

I suspect our Cartesian hangover encourages this sort of wild speculation. Once we cherished dearly the thought that the mind was entirely separate from the materiality of bodily existence; now we worry about the erosion of the walls of our mental isolation from materiality, walls which (probably) never existed in the first place. Though there is a significant difference from earlier accounts of the intimate relation between mind and body: the new materiality that threatens to envelop us is not just any natural material. It's our own technological creativity that threatens. Inverting the old image of a person lifting themselves up by their own bootstraps, we are now swallowing ourselves whole.

Everyone needs to calm down and take a page out of the history of reading the biblical texts--or any canonical or venerated scriptures, for that matter. The organization of these sacred collections follows the technological means available in a given time and place. The Bible as we know it today was only formally collected in its present form in the 13th century, and only made widely available in the 15th century, with the invention of the printing press.

With the introduction of the printing press, how people read changed fundamentally, albeit over the course of a few centuries. (Technological change moved more slowly in the past.) A Late Antique or medieval Christian reader of the Bible would have spent hours contemplating the words of the text, as Saint Augustine describes himself doing, poring over them in search of God's truth. The reader approached other texts with the same sort of reverence. Antiquity bestowed authority upon the written word. Readers could only assimilate the messages texts contained.

Fast-forward to the 15th and 16th centuries: European scholars are buried neck-deep in cheaply produced publications. This changed technological context is an obstacle to the sort of immersive reading practices of early generations. Which is not to say that scholars failed to immerse themselves in texts; only that they did so while reading a great number of other texts besides. The natural impulse was now to examine, compare, criticize, and raise questions as to the veracity of this or that passage in Herodotus, for example, or Pliny, or one of the five Books of Moses.

And most importantly, the location of truth changes. Truth is no longer necessarily found in any one text, even though a text like the Bible will still be called 'the Word of God'. The truth is now located, as it were, between texts; or more precisely, in the critical faculties of discerning readers.

The image of Francesco Petrarch's indecisive wavering at the crossroads where the active and the contemplative life parted ways in The Ascent of Mount Ventoux is an important antecedent. Lorenzo Valla's criticism of papal claims to authority on the basis of The [Spurious] Donation of Constantine provides a more poignant example of a changed attitude towards texts. The efforts of Renaissance Humanists and Protestant Reformers to publish critical editions of the Ancient Greeks, Romans, and the Fathers of the Church all belong to the beginning of a long process of steady erosion, at the end of which the authority of the text as text will vanish.

The transition between these two eras was no doubt unsettling for many. About any private doubts that a text's apparent loss of sanctity may have raised, I have nothing more to say. My point is only that things have changed in the past, and will no doubt do so again in the future. No need to lose one's head over them.

Saturday, February 02, 2013

What Qualifies as a Religion?

Ever wonder what people mean when they use the word 'religion'? It turns out that it's not always clear.

Around the same time European merchant and military adventurers, especially of the sturdy British variety, were placing the rest of the globe under imperial rule, European scholars were rudely awakened to the emptiness of their cherished intellectual idols. The best scholarship to date had categorized religion in terms of three or four basic forms: usually Jewish, Christian, Muslim, and pagan. Judaism was somewhere in the mix, but didn't always qualify, or so it was thought, for independent treatment.

Then Europeans got wind of the ancient civilizations existing in India and China, and also of the great literary traditions associated with Buddhism, Confucianism, and Hinduism. Some of the more perceptive of these men, and they usually were men, realized the old categories didn't actually match the reality of the situation. The newly discovered depths of these ancient religious traditions--for, presumably, that was what they were: religions--fit, by default, into the category of inferior pagan traditions, on par with those of Ancient Babylonia, Egypt, Greece, or Rome. With its many gods and goddesses, Hinduism seemed to conform to the old pagan pattern. The civilization and literary tradition associated with it, however, suggested anything but inferiority. In some ways, its content even suggested a greater maturity than anything Christendom had produced.

Buddhism and Confucianism were even more difficult to press into the old way of categorizing religions. Adherents of some stripe of Christianity themselves, European scholars tended to presume religion involved the worship of a transcendent deity, a singular being whose existence comprehended the existence of all other beings, and who went by the non-descriptive title God. While certain varieties of Buddhism had gods, of course, it quickly became clear that the Buddha himself, and the mainstream of Buddhist teaching, taught that the pinnacle of existence was characterized rather by a profound Emptiness, or Nirvana. Likewise, Confucius' insistence that human beings ought not speculate about things they could not know, but focus on things within their mental grasp, like the ills of society, reminded European scholars of the ubercritic, the ancient Greek philosopher Socrates.

At the end of the 19th century, Friedrich Max Muller stated the dilemma like this: we have believed that religion is characterized by the worship of gods and God, all personal deities. If this definition is always and everywhere true, then Buddhism and Confucianism are not religions. And, as a consequence, we seem to need to find some other name for them.

A few European scholars took up and defended the latter proposal. The intellectual inconveniences with which the rustic Hebrew texts had saddled their interpreters could be swept aside, they believed, and a new rational, uniform system put in their place. And Buddhism, for example, showed us how to do that. Or a fresh look at social ills as social ills could be taken, and not as supposed expressions of God's wrath towards moral failings. And Confucianism, the great humanist doctrine that it appeared to be, showed us how to do that.

These attempts to redefine the eastern religious traditions would not carry the day. The ancient traditions ended up looking rather more modern and secular than saner minds deemed possible. Later generations saw that their teachers had co-opted the great eastern traditions in order to criticize the great western traditions. Using intellectual weapons fashioned half a world away, weapons they hardly knew, they criticized things they thought they knew only too well. Belonging to polite society, of course, they never put the charges in quite such stark terms, but the implications are there, hovering in the background.

Muller had the wherewithal to see that minor intellectual skirmishes, for example, over whether Darwinian evolution could be made to fit with the first chapter of Genesis, whether scientific minds could any longer believe in miraculous events, or how a good God could allow evil to exist, were nothing more than intellectual vanities--which also made them distractions. Religious traditions may yield theoretical dilemmas, but they are not discredited on the basis of an apparent logical inconsistency.

The problem, Muller realized, was that scholars had sought a definition which would fit their particular understanding of how divinity related to the world. If certain instances of religion failed to fit into the definition very easily, this was usually explained away by calling the religion false.

The better way was not to place oneself in the seat of divine judgment; the better way was to listen and watch as human beings articulated how they ordered their lives and their worship. Instead of placing God at the center of his inquiry, Muller placed the human being, looking out and up towards the infinite extension of sky beyond the horizon, where the eye does not reach, and down and in at the infinite depths of the self, where the mind cannot go.

And he wondered about what sort of being it was that could ask questions that had no apparent answers--which, as it turned out, gave him a definition that applied in all instances.

Friday, February 01, 2013

Smoggy Days

Glancing through pictures of an obscured Beijing skyline provides a sobering perspective on the local Walmart and Target discount stores, or any product that has 'Made in China' stamped on its exterior. The Atlantic has posted a rather stunning collection, and I recommend number 9 to you, especially, as a good example of visual irony. As the workshop of the world, it seems, China must also therefore be the tailpipe on the engine of economic progress.

It's not as though the Western world is unfamiliar with coal-fired smog. The Great Smoke (or Smog) of London in 1952 lasted four days, killed 4,000 people, and made 100,000 more ill. More recent research, examining complicating factors, suggests the actual number of fatalities lies north of 12,000. The Great Smoke was the last of the 'pea-soupers', which plagued London through the 19th and early 20th centuries. On this side of the Atlantic, Los Angeles and Mexico City both have notorious records through the middle of the 20th century. The city of New York also had its problems during the 1960s.

The narrowly defined needs of economic development appear out of step with a much larger set of concerns that impinge on human life--for example, the need to eat, sleep, and, in this specific case, to breathe. These latter concerns don't sound as if they are larger, of course, and that has something to do with the way we have been taught to think about ourselves in relation to others. Breathing seems a rather minor affair by comparison to the number and volume of transactions on the NYSE trading floor or the year-over-year growth of the Chinese GDP.

The way we have been taught to think about ourselves owes something to a titanic debate between the 'ancients' and the 'moderns', which, on account of being modern, we have largely forgotten about. Specifically: a debate over the definition of 'material'.

For the ancients, material was the stuff that never got into a person's head. One could think about a tree; one could imagine different things to do with a tree. The idea of a tree was in a person's head. But the tree itself, its bodily, material existence, was out there in the material world, never completely assimilable to human purposes, because it could never be completely comprehended in thought.

For the moderns, the idea that bodily, material existence was somehow beyond the human mind became more and more difficult to mentally digest. Astute commentators have noted that both Adam Smith's account of capitalism and Karl Marx's account of communism share a 'materialist' basis. But here 'material' refers to the human activity of 'rationalizing' labour processes to maximize economic productivity and 'material' prosperity--or, as Marx would so aptly put it, 'material activity'. In principle, nothing escapes rationalization. What does escape is labelled 'false consciousness'.

The difference between these two definitions of material is obvious. The ancient definition admits that something always escapes the grasp of the human mind, stubbornly evading our best efforts to reduce it to a simple formula, while the modern definition forgets the same. Every once in a while, however, that something rears its ugly head in very immediate and tangible ways. It does in John Brunner's perceptive 1972 novel The Sheep Look Up, which attempted to describe how a society on the edge of environmental collapse might function. And it is doing so now in China.

It becomes apparent that, in the process of creating artificial forms of material wealth, we cannot simply assume the natural materials--air, soil, water--needed to sustain human life will always be there.