Monday, April 22, 2013

Teaching Philosophy

The study of philosophy, or the study of what other people have said that gets categorized under the heading 'philosophy', has the potential to leave one feeling like a perpetual student. Expertise can be acquired in a so-called "field" of study. Along the way, however, students will have realized their "field" of study looks less like an actual field and more like a narrowly defined section of a bookshelf in a stuffy library. They will have also realized that one never finishes studying philosophy.

It's an interesting thought experiment to put the shoe on the other foot. What if you had to teach philosophy? Where would you start? Any student can give non-committal answers to questions about what they are studying. They are on their way to knowledge, and, in any case, there is too much to capture in a single phrase. On the other hand, a teacher lacks that bohemian luxury. They must say something.

The first thing I would do is set aside any latent Socratic inclinations. The Ancient Greek philosopher Plato cast his teacher, the famous Socrates, in the role of questioner, leading his conversation partners to the realization of truths they already knew, but had not quite been able to articulate on their own. My approach would instead include a discussion of a famous philosophical text. We would never have known Socrates, after all, had Plato not cast him as a character in his philosophical dialogues. Regardless of how much talking philosophers do, they are always falling back into texts, each of which preserves a small portion of a tradition of reflection that never quite comes clearly into focus.

More to the point, then, which text would you start with? And let's say, for the sake of argument, it could only be one text. Not one thinker. Not one philosophical school. Not one series of texts. One text.

My choice would be Rene Descartes' Discourse on Method (1637). There are a few reasons why.

First, the text is relatively brief. It contains six chapters in total, all of which can be digested in a single sitting, or over a number of sittings, without much difficulty.

Second, the purpose of the text is not to take a position on anything in particular, but to introduce the reader to a way of thinking about things. You are invited to follow Descartes on his philosophical journey, to think through why he came to the conclusions that he eventually did.

Third, Descartes' philosophical observations are woven through a personal narrative, allowing for historical commentary. That means a teacher can put "flesh" on the bones of the argument, placing arid speculative suggestions in a more recognizable human context.

Fourth, the Discourse contains recognizably contemporary intuitions about the nature of things. It takes the form of a personal narrative. It is playfully experimental with the ideas it presents. It wonders about the nature of the human self. And, most importantly, God has been displaced from the center of inquiry. The entire world of human experience is no longer assumed to come from God and return to him. (It still does, of course. Descartes holds God to be the Creator of all things. He is not willing, however, to start with that as the presupposition of his inquiry.)

The basic argument of the Discourse is that Descartes had discovered many reasons to doubt all those things he had formerly held to be true. His experience of violent and destructive discord among different Christian sects during the Thirty Years' War had led him to seek a more certain basis for knowledge. The revelation of God could not be trusted, as it was mediated by human beings. The same went for the teaching of the schools, by which was meant the abstruse logic-chopping arguments of the late medieval world.

So Descartes resolves to doubt all that can possibly be doubted, and in the process doubts not only what other people have told him, but also what his own eyes and ears "tell" him. He even suggests, for the sake of the experiment, that it is possible to conceive of oneself existing without a body. After "emptying" his mind of all those doubtful thoughts, Descartes arrives at the one conviction he cannot shake: I think therefore I am--"that is to say, the mind by which I am what I am, is wholly distinct from the body, and is even more easily known than the latter, and is such, that although the latter were not, it would still continue to be all that it is."

All of this is ripe for discussion. Is the format of philosophical dialogue effective in conveying the author's intention? To what extent is one's thinking conditioned by one's lived circumstances? What would it mean to begin one's thinking with God rather than oneself? And vice versa? Is it possible to doubt everything? Can we ever be certain of anything? What might it mean that our minds are completely distinct from our bodies? Does any of this even make sense?

The closer you look at a text like the Discourse, the more perplexing it becomes. Descartes is perhaps best described as not quite modern enough. Which makes his Discourse the perfect choice to begin teaching philosophy.

Does anyone else have other suggestions?

Friday, April 19, 2013

Menial Labour

I am no stranger to what is elsewhere called 'menial' labour. Growing up in rural Ontario, my first jobs were both physical and monotonous. The same tasks had to be performed day in and day out. The jobs were, almost without fail, dirty jobs--especially when I was cleaning things. I was good at these menial jobs. I wasn't great at them. I could perform adequately the tasks required of me, though I was unlikely to perform them expertly or to take much initiative of my own. The sorts of thinking required to see solutions to very rural and/or blue collar problems were not in my possession.

I also have exposure to white collar 'menial' labour. The most recent bit of experience I can cite comes from last night, invigilating a chemistry examination. I thought I would try invigilation out this year, so I threw my name into a pool of potential hires. A single hour-and-a-half training session a week in advance and a 15-minute pep talk before the exam were supposed to make our tasks straightforward and obvious. Then a 25+ person team was sent in a number of directions, each with an examination 'package' in hand, but more or less without support.

Sent to the room with the examination package, complete with examination papers and instructions for their distribution, as well as a half hour to spare, I realized immediately that someone getting paid a lot more than me had failed to assign the necessary second person to the room. Unable to raise my supervisor on the phone, I started prioritizing tasks. The examination 'circulator' eventually made their way to the room that I was in and realized much the same thing. For some reason, though, it was my fault that things weren't getting done the way they were supposed to get done.

A second person was sent to the room a half hour after the examination had started, which itself had been delayed by ten minutes. Having been told repeatedly to follow every step on the invigilation instruction sheet, I relished the opportunity to cut corners where corners could be cut. It wasn't my fault, you see. I did the best with what I was given. If my best wasn't good enough, don't blame me for doing my best. Blame my superiors for their incompetence.

This most recent experience with white collar 'menial' labour impressed upon me the dreadful impenetrability of bureaucratic structures, in particular those in immediate authority above you. The experience also raised some questions, in my mind, about why the exercise of authority proceeds so differently in a rural and blue collar world than in a white collar world (though my observations would also apply to a highly structured factory environment).

As I said above, I was a good worker, but not a great worker. Those persons I worked under, whether in farming, landscaping, moving, or construction, seemed to understand as much. I put in long days of work, and only once or twice over the course of a decade do I remember being belittled for a failure or mistake. More to the point, those persons with whom the responsibility ultimately lay usually went about fixing the mess that I had made without too much complaint. There is a certain inevitability to mistakes, was the guiding sentiment. Try to prevent them, but deal with them as humanely as possible when they do happen.

In the white collar world, by contrast, I was surprised how vigorously my supervisors made it plain to me that their failings were ultimately my responsibility. There is a certain rationale for doing so, of course. In the moment, I am the one who has to perform in order for their program to be put into action. But the bureaucratic structure falls to pieces when those in charge fail to anticipate an obvious problem and also vigorously protest the smallest exercise of independent judgment in the matter. The bosses not only think you are stupid and incompetent. They treat you like it too.

Why the difference between these two sorts of bosses? It may be that what I am describing is merely a function of the size of the organization. But I have to think it is also a consequence of the sorts of materials being worked on. In the rural and blue collar trades, you work with particularly stubborn, resistant, and in every case non-rational materials. Fields of wheat do not rebel against you, nor do skids of lumber and brick talk back at you. Persons assigned to do a specific task in highly structured, rationalized processes, on the other hand, are expected to comprehend and implement a set of instructions in very short order. They are also instructed not to think for themselves, which, should something go wrong, has a real potential to let things go from bad to worse very quickly.

So I wonder if facing stubborn non-rational resistance necessarily inculcates a very different sort of response from bosses than does facing the apparent irrationality of menial wage labourers in a highly structured working environment. Why do we expect something different from persons than we do from the non-human sorts of materials that we work on? Arguably, human materials are more difficult to shape to our wishes.

Tuesday, April 16, 2013

What is Philosophy?

I have a few moments. So I want to ask the basic question: What is philosophy? Instead of answering the question, though, it might be useful to reflect on how the question might be answered.

It seems to me we have two basic options, whether due to the limitations of language, or cognitive capabilities, or both. We can say philosophy is what it is (that is, philosophy, which doesn't get us very far at all), or we can define it in terms of something else (e.g. the love of wisdom, critical thinking about X, Y, and Z, etc.). These two possibilities represent relations of identity and relations of difference. They are probably best termed strategies for analysis, not necessarily methods for getting at the truth of things. Relations of identity presuppose differences between identities, and relations of difference presuppose identities which are different. When I ask, What is philosophy? simply by using the word, I bring along a host of more or less (as yet) uninterrogated meanings.

If wisdom is identified with the divine and philosophy is the love of wisdom, for example, then philosophy is also the love of God--which raises questions about whether philosophy and theology and/or religion are so different. If critical thinking is identified with an inquirer ready to question every possible assumption, then philosophy is allied with critique, doubt, or a skeptical stance towards knowledge claims--which raises questions about whether philosophy has anything in common with the dogmas of religion and theology. I don't want to ally myself with either of these definitions. I do want to observe the interrelatedness of definitions with other definitions.

Now, I have a Masters in Philosophy. However, I was warned, in a roundabout way, by my supervisor not to pursue a Ph.D. in the discipline. The reason given was that my thinking was much too theological in cast to succeed in a philosophy program. That's probably more or less true. The result was that I ended up in Religious Studies, where I am quite happy teaching and thinking about subjects related to religion and its history--though in a Religious Studies Faculty, not a Theology department, which was my preference.

You see, for the same reason I wasn't prepared to do a Ph.D. in Philosophy, it seemed to me I also wasn't prepared to do a Ph.D. in Theology. Everyone was talking (that is, from my naive undergrad and Masters degree perspectives) about philosophy and theology as if they were objectively describable things to be studied. With regard to theology, that made a small amount of sense, since theologians claim to be talking about something real, something 'out there', which has been mediated by scriptural sources and a long textual tradition of reflection on those scriptural sources. In the case of theology, there is something out there to objectify, something I can point you towards, something we can consider together and talk about.

What about philosophy? There appears to be a textual tradition going back to Plato and Aristotle that can be studied, though I suspect philosophers prize at least the idea of freedom of inquiry too much to be explicitly tied down to any specific set of texts. One hears it suggested that philosophy is not limited to the study of a certain body of literature, but is a way of thinking about things imparted from teachers to students (much as Socrates was supposed to have imparted his wisdom). That may be the case. Such a definition only distracts from the omnipresent role the study of texts plays in philosophy departments and philosophical armchairs (whereupon the armchair philosopher sits).

At this point, in order to wrap up a blog post that is already much longer than I anticipated, I want to show my cards. I have soured towards the idea that separate academic disciplines (philosophy, theology, history, political theory, English literature, etc.) are as distinct from each other as many of our teachers have supposed. It seems to me that common to each of the so-called separate disciplines is the thinking human being, reflecting on some body of evidence. There is no thought without some object, as David Hume reminded his Cartesian interlocutors--at least none that I am ever aware of. The theologian thinks, the historian thinks, the philosopher thinks, etc. They think differently, however, according to their different objects of inquiry.

And it seems to me, if philosophy is anything, it is reflecting on (or thinking about) how we think about things. Full stop. The definition of philosophy needs to be made with reference to the human being who thinks about things, and not some set of abstract definitions. Not, say, the love of wisdom apart from the person who loves wisdom. Not critical inquiry apart from the person who inquires critically. Not a definition considered at an abstracted remove from the person considering the definition. Rather a person who can say to themselves, I am thinking about things, and that's what I normally do; and when I philosophize, I think about what it is to think about things.

Saturday, April 06, 2013

A/theism and Certainty

Patrol Magazine retweeted an article published last October on the modern history of A/theism. The two words theism and atheism are paired together, the article's argument goes, because modern theism cannot be understood apart from modern atheism, and vice versa. They 'emerged from the early modern world together, as two sides of the same coin', a claim which fits well with the portrait of modern culture painted by the intellectual authorities, including John Milbank (Theology and Social Theory) and Charles Taylor (A Secular Age). The contest between theism and atheism in the modern age is presented by partisans as a zero-sum game. The winner must take all and the loser must be vanquished from the field.

The author of the article, Kenneth Sheppard, notes a correspondence between late 20th century assessments of modern A/theism and 16th and 17th century attempts to cover the same intellectual ground. For so many of the persons involved in the discussion, theism and atheism go together like transcendent and immanent, each term in these pairings excluding the other, but also presupposing the existence of the other in its own need to exclude something. Atheism needs theism like science needs the straw man of religion to knock down. Theism needs atheism like good needs something evil to vilify. In this sense, they are like children behaving badly.

Sheppard situates A/theism in larger 'processes of disenchantment, desacralization, and secularization'. Instead of seeing theism and atheism as opposed over matters of religion and science, the better thing to do is observe how theists and atheists make sense of the world as the language of scientific discovery drives fantastical claims from the public square. Instead of demonizing one from the vantage of its opposite, pause and take note of those cultural trends they commonly presuppose. The two sides may talk as if they share nothing in common. Historians like Sheppard, however, know better than to buy into their self-assertive, but partial, ideological perspectives. Where there exists contiguity in space and contemporaneity in time, ideologues are shown to be liars of the first order. All the talk in the world cannot hide the fact that some cultural currency is shared in common.

The analytic framework proposed is a helpful move in the right direction. Once one stops trying to measure the perspective of one's opponents against the measuring stick of History (with a capital 'H'), it should become a whole lot easier to have a conversation--in principle, at least. When the political left, inclined towards atheism, and the political right, inclined towards some variety of theism, are divided from each other as past is from future, there is very little reason to talk. Conservatives are stuck in the past, say the progressives, and progressives have forgotten the past, say the conservatives. The measuring stick of History tends to distract from an obvious truth: that all of our business with each other is transacted in that shared moment, when the past is no more and the future not yet, called the present.

The purpose of the article, if I have understood correctly, was, in very post-modern language, to create space for dialogue where 'traditional religious believers, “nones”, and atheists can relate to and work with one another in spite of what can seem like our insurmountable differences.' This is all well and good, and I am all for having a friendly conversation on a level playing field. But the article's argument seems to thrust readers in the direction of abandoning their idols, all those things they hold dear, without actually interrogating why we hold onto our idols with the tenacity that we do.

Sheppard speaks very generally about historical processes, and very little about historical actors. That is a problem, it seems to me, because I have never encountered one operating apart from the other. He speaks very generally about what we believe about the nature of God and the world, and very little about what we have thought about ourselves.

If there is one thing that sets modern A/theism apart from its premodern manifestations, in my estimation, it's an ideal of certitude shared by all alike. The modern atheist rests assured that there is no God because the evidence is lacking, while the modern theist does the same because that's what the Scriptures say. The sorts of evidence to which appeal is made change, but the constancy of conviction does not. The origins of the certitude might be traced to such luminaries as Martin Luther ('Unless I am convinced by the testimony of the Scriptures or by clear reason...Here I stand. I can do no other.') and Rene Descartes ('This proposition, I think, therefore I am, is the first and the most certain which presents itself to whoever conducts his thoughts in order.'). The exposure of the baseness of all these simplistic appeals to certitude, e.g. in the work of Nietzsche and his post-structuralist disciples, might also be cited, though as proof of just how deep our certainty runs, now that we have become certain of our uncertainty.

So I will take my departure from Sheppard where he suggests we tell 'critical stories' about the 'conditions of our belief'. (Why not build a campfire and bring some guitars?) That suggestion sounds like an exercise in talking around the issue, which has instead to do with whether and in what sense we are certain.

Thursday, April 04, 2013

Muller on Thought and Language

The Ancient Greeks used the word 'logos' to symbolize two sorts of things we today usually keep separate: on the one hand thoughts, and on the other hand spoken words. Not even written words (like these words on the screen in front of you) were regarded as highly as spoken words. Only the spoken word carried the immediate force of a person's thoughts. It carried the force of a person's soul, their purpose, even their life. Words on the page were dead letters, hollow reminders of things once spoken.

Reading through Friedrich Max Muller's lectures on Natural Religion, I was struck by just how far our intellectual convictions in the 21st century have wandered from Ancient preoccupations. More recent figures like Thomas Hobbes and John Locke could still have carried on an agreeable conversation with the Ancients about the things that follow from the intimate relationship between spoken words and thoughts. Intuitively, I think, we should also be able to recognize what they are talking about. We each have our own 'internal monologue' by which we think through ideas in the form of a more or less broken conversation with ourselves. (Please tell me I am not the only one!) But we don't place the same sort of theoretical value on the distinction between our internal monologue with ourselves and an external dialogue with other people (or with oneself, though that usually attracts the concerned attention of other people).

Muller establishes, fairly persuasively in my estimation, that no human being thinks without words. Our knowledge of language comes out of processes of socialization, especially early on in life. Knowledge of language allows for the communication of desire, purpose, or query. All of those 'higher cognitive functions' seem to depend on a mastery of language. Now that is not to say that other animals do not cognize and communicate. But what they lack, Muller thinks, is the ability to abstract and categorize, analyze and synthesize--specifically those things that have allowed human beings to cultivate the ground, transform the natural world, build up a civilization, and write books and blogs about it, wondering what it is to be a being that has words--logoi.

The conclusion he eventually puts to his readers still manages to be something of an eye-opener.
The reason why real thought is impossible without language is very simple. What we call language is not, as is commonly supposed, thought plus sound, but what we call thought is really language minus sound. That is to say, when we are once in possession of language, we may hum our words, or remember them in perfect silence, as we remember a piece of music without a single vibration of our vocal chords...But as little as we can reckon without actual or disguised numerals, can we reason without actual or disguised words.
The first part of Muller's observation is strange enough on its own. It never dawned on me to ask myself whether language was thought plus sound or thought was language minus sound. The comparison itself is intelligible enough. I have thoughts, and you can't hear them unless I speak my thoughts, at which point my thoughts become audible words. But I never thought the difference might be theoretically productive.

Muller's decision against defining language as thought plus sound, in favour of thought as language minus sound, is even more perplexing. (Hence I am blogging about it.) The decision corresponds well with the above-noted observation that our knowledge of language--and our ability to think--comes through processes of socialization. We don't just make up our own words. Someone, usually our parents, teaches us how to use them. The decision also conceptualizes words as objects of study. They are cast as things that we can both look at and think about, and then have a conversation about. They are perceptible objects, ultimately not reducible to the interpretive whims of persons.

But does Muller's account make sense of our individual experience using words? When he says thought is language minus sound, he seems to suggest that the language we use does our thinking for us. And, no doubt, there is something to this. If people spend enough time together, talking to each other, they end up thinking more or less along the same lines. We tend to listen to and read things that confirm our sense of the world around us.

I have to wonder, though. I personally have had the not infrequent experience of lacking the right words to express my intention. The words don't correspond quite right to an objective state of affairs, and so I find myself unable to communicate my meaning. The result is that the logoi in my head don't always seem to match up with the logoi in someone else's head. The only thing to be done is root around in my head for better words or better ways of stringing words together.

The Ancient Greek idea of logos is able to make sense of this situation. It locates intelligence both inside and outside a person's head, but doesn't require that the correspondence between them be completely transparent. About Muller's conception of language, I am not so sure. If thought is really just language minus sound, if the correspondence really is transparent, one has to wonder who is doing the thinking.

Wednesday, April 03, 2013

Dying with Iain (M.) Banks

The Scottish science fiction writer Iain Banks announced to the world today that he probably only has a few months left to live. Diagnosed with gall bladder cancer a couple of weeks ago, Banks has put his feverish rate of literary output on hold indefinitely, asked his partner of many years if she would do him 'the honour of becoming my widow', and plans to spend the remainder of his days visiting with family, friends, and locations that hold personal meaning. He has not yet decided whether he will pursue chemotherapy treatment to extend briefly what time remains to him.

Banks breathed new life into the high art of hard science fiction, which had known such masters as Isaac Asimov and Arthur C. Clarke, with his series of Culture novels. The better exemplars of the genre are defined by a certain cosmic gimmick, setting the stage on which the plot line unfolds. In Asimov's Foundation Trilogy, for example, the discipline of psychohistory, developed by the patriarchal character Hari Seldon, promised to unlock the key to social development. Seldon predicted the decline of the Galactic Empire, and laid foundations for a much more durable successor. The predictive failure of psychohistory to account for an enigmatic figure known as the Mule, a sort of galactic Napoleon, drives the plots of the second and third parts of the trilogy.

The cosmic gimmick driving Banks' Culture novels does not allow for quite so much human participation. The Culture novels form a collection of more or less disconnected narratives set in the same universe. The Culture is a vast civilization governed by massive artificial intelligences, who keep a human population sprawling across planets, airspheres, orbital platforms, shellworlds, and ships spread across a large portion of several galaxies (if my memory serves me correctly). The narratives play out in the vast distance between the finite human mind and what are, for all intents and purposes, practically infinite Minds. Banks has a gift for imagining vast intelligences whose experience of space and time is utterly dissimilar from human perception.

The Culture is a 'post-scarcity' society, in which no citizen lacks for their basic needs. Surrendering the government of human society to the Minds, removing human avarice, error, and whim from the political equation, meant that material equilibrium in society was now possible. Money and personal possessions no longer exist, though material prosperity still allows for the cultivation of privacy. There is a moral seriousness to Banks' storytelling. He doesn't shy away from exploring the fiber of a society that has grown fat, complacent, and playfully irresponsible, and whose personal bonds are reinforced by an artificial structure. At the same time, the Culture narratives seem to play out like an internal monologue in Banks' own head as he explores the logic of his atheist convictions. Many of his characters regard their own existence with a sort of bemused shrug one can well imagine their author shares. A touch of the great stoic Scotsman David Hume exists in Banks--and there would be more, if he weren't so damned Hegelian.

I started reading Banks' work about six years ago, around the same time I picked up George R.R. Martin's Game of Thrones series. It was his ability to expound on philosophical themes in novel form that prompted me to read as much of his work as I had the time to spare. Like so many other science fiction authors, Banks rethinks divine transcendence in terms of a future state of affairs, rather than an eternal present, which is the same everywhere, past, present, and future. Divinity, though still exceedingly powerful, is placed under spatio-temporal constraints. In the case of Banks' Minds, they emerge from the depths of human creativity, achieve independent sentience, and are let loose to care for their creators. Granted, this only seems like a different form of servitude; but the Minds, particularly the ship-based Minds, seem to take it all in stride and with dry humour.

Science fiction writers are usually at their best mocking the old ideas of God and domesticating them to their purposes. I say usually because I am not sure that someone like Robert J. Sawyer actually knows how to do anything more than preach to an atheist choir. Banks' literary engagements succeed, to my mind, on account of his willingness to acknowledge that dethroning the old gods does not eliminate the existential questions for which the old gods provided answers.

Not wanting to sound insensitive, I will nevertheless be curious to watch the moment when the pen Banks uses to write this final chapter of his life finally falls from his hands and is taken up by an increasingly vocal atheist elite. Banks' life is likely to be eulogized, his self-sufficient hold on existence, his lust and zest for life, held up as an example for atheists everywhere, much like the late Christopher Hitchens' life has been celebrated.

Hagiography is a double-edged sword. When you extol the virtues of mere mortals, they usually end up appearing more mortal and less virtuous. It has very little to do with the person being eulogized, in any case, and more to do with what s/he has meant or continues to mean for us who live on. But perhaps it is best not to speed Banks along his way just yet by thinking on what might be. Some time still remains. And the publication date for one final book has been moved up.

Tuesday, April 02, 2013

They Knew Not Dawkins

The Book of Exodus begins with these ominous words: 'Now a new king arose over Egypt, who did not know Joseph.' The words signal the beginning of a new chapter in the history of the people of Israel. Joseph had been one of the twelve sons of Jacob (also named Israel). His prudent government had saved Egypt, and by extension the people of Israel, who moved down to Egypt, from seven years of famine. But at the beginning of Exodus, we find Jacob's children in slavery to a king who knew not Joseph.

Now I don't want to make too much of the analogy I am going to draw with the opening lines of Exodus. At least, nothing quite so ominous. But it seems, in the course of four short years of teaching an Introduction to World Religions class, I have witnessed the coming of a generation of students who do not know Richard Dawkins. The realization caught me by surprise, and got me thinking about what sort of cultural groundswell might be occurring.

Lecturing on Judaism four years ago, I used Dawkins' dismissive reading of the first chapter of Genesis as a counterpoint to what an original reader might have taken away from the text. I countered the Dawkins-take on the so-called six-day creationist interpretation by observing that the only real expectation original readers probably took away from the text was that a week had seven days. Not five days, not six days, not eight days, not ten days. Seven days: six on which people work, like God worked, and one on which they rest, exactly like God rested. Certainly nothing about the scientifically-verifiable beginnings of the universe, which was a more or less meaningless notion at the time. My evidence? That's what Exodus 20, otherwise known as the Ten Commandments, says the original readers were supposed to take away from the Genesis narrative. Before I gave my pious spiel, however, I asked how many people in the audience knew who Dawkins was. Out of a class of 50 or 60, a full third raised their hands.

When I asked the same question this semester, not one person in 50 raised their hand. A bit taken aback, I think I sputtered through an explanation of who Dawkins was and why he was significant. The life of an atheist, of course, is not exactly on topic in a world religions class. Tying the now irrelevant reference to Dawkins into a short discussion of the shortcomings of six-day creationism, I managed not to look too much the fool.

It seems today's students of 18, 19, or 20 years of age do not know Dawkins. As far as popular intellectual discourse goes, it seems like Dawkins is all I've ever known. What changed over such a short period of time? In the broader scheme of things, popular culture has most likely chewed Dawkins up and spit him out, which will happen to most everyone who courts the public eye for too long. With his one message about the delusions of faith, Dawkins was bound to be effective in the short term, but would tire audiences out over the long term.

More tellingly, perhaps, the composition of the world religions class has also changed. My general impression is that the number of Caucasian students has declined in proportion to other ethnic demographics--especially Middle Eastern and Indian.

So the reasons why the popular discourse has shifted away from Dawkins' anti-religious messages may run deeper than mere generational shifts. For the first time in my life, many students coming to religious studies are largely innocent of the lengthy 19th and 20th century arguments against religion stemming from the Enlightenment tradition--against Christianity specifically, to be sure, but religion more generally. I have caught myself a number of times using Western atheism as a foil in conversation with students, for example, in comparison to Buddhism or Hinduism, or in comparison to the charge of atheism brought against early Christians. Students are always polite, but I have left conversations wondering whether alluding to Western atheism was the best way to illustrate a point.

Prospects for the future are interesting. My intellectual battles, the intellectual battles of my teachers, and their teachers, and their teachers' teachers, may be nothing more than an antiquarian curiosity to the next generation of students. That's strange to think about. The Christian sub-culture in which I grew up and was educated defined itself over against a secular world, which was in its turn defined by nominal professions of faith and outright skepticism. The Christian sub-culture was animated by the myth of a lost Christendom, a place from which we came and back to which the faithful would have to bring the country, kicking and screaming if necessary. Growing immigrant communities, however, don't carry the same chip on the cultural shoulder that Christian communities do. They do not necessarily have the same suspicion of the secular public square, nor do they see the sort of tensions between religious life and public life that domestic Christian communities have internalized.

Perhaps I shouldn't underestimate the ability of Western academia to thoroughly inoculate the second or third generations of immigrant families against religious beliefs. At the same time, I am fairly confident Christian communities will not abandon their suspicion of the secular public square any time soon. These are intellectual potentialities. What about the demographic numbers? Given the growing numbers of immigrants, it is becoming increasingly unlikely that immigrant communities are simply going to choose between one of the two Caucasian solitudes.

I wonder, therefore, how debate about the nature of secularity is also going to change in the next few decades. For the last 40 or so years, the North American arm of the debate has increasingly been couched in winner-take-all terms. The character of the debate changes if there's more than one major religious participant (or two, if you count Judaism; or three, if you count Mormonism). The character changes significantly if one of the new participants is not a native Western European tradition.

Monday, April 01, 2013

Evangelicalism and Science

For the sake of clarity, though the title may allude to questions about the relationship between something called religion and something called science, I should point out my intention is not to commit cognitive suicide. Sometime in the middle of the 20th century, academics, including not a few Evangelicals, decided it would be a good idea to pretend religion and science were discrete things. Ian G. Barbour's taxonomic, historical analysis of these supposedly discrete 'things' is an especially famous example. Those who followed his line of thinking bought into a fundamental conceptual error, one that would have beggared belief among pre-modern and early-modern thinkers. No longer did human beings have thoughts in their heads, but thoughts had human beings through whom they expressed themselves on the historical stage. Religion did this and that, these taxonomists said, while Science did the other. In short, they committed cognitive suicide.

So, for the sake of clarity, what is meant here by 'Evangelicalism' is how self-identified Evangelicals have understood and continue to understand themselves in relation to a world full of other persons and a myriad of other things. More to the point, what is meant by 'Evangelicalism and science' is how self-identified Evangelicals have thought about the methodological assumptions of contemporary science. Neither Evangelicalism nor science possesses thing-like qualities. Rather, they are terms employed to designate ways of thinking about the human experience of things in the world. A self-identified Evangelical, on the other hand, does possess a thing-like quality; though the capacity for self-definition means, despite evidence to the contrary, Evangelicals are thinking things--or what classical thinkers termed rational animals.


The relationship between Evangelicalism and science is best described as a highly-selective exercise in self-justification. The results of any scientific process through which a research question is formulated and a hypothesis is developed will be highly contingent upon the researcher's personal motivations, values, outlook on life, and so on. In the case of Evangelicals, however, there occurs a conflation of moral and scientific reasoning, which reduces both scientific study and moral judgment to mere exercises in self-assertion.

The paradigmatic example of how Evangelicals have understood the relationship between moral judgments and the methods of scientific study is quite naturally creationism, as well as its intellectual progeny, scientific creationism and now intelligent design. Original six-day creationism was formulated as a response to social-Darwinism. After Darwin had made men out of monkeys, the scientific elite began proposing all manner of social engineering programs for the betterment of the human race. Social engineering was the province of totalitarian governments, which ran roughshod over the personal dignities and liberties of individuals. The moral deficiencies of social-Darwinism, in the minds of creationists, meant that evolutionary accounts of humanity's natural history were therefore wrong. The idea of the human being created in the image of God provided a moral bulwark against the temptation towards social engineering, which meant that the natural history of humanity's origins related in the first chapters of Genesis was therefore right.

The prospects of original six-day creationism eroded steadily through the 20th century. The larger moral battle against social-Darwinism was largely won in North America and Western Europe with the fall of Nazi Germany. Scientific creationism and now intelligent design theory both operate with a much more restricted agenda. No longer able to claim to be fighting on behalf of humanity, these are now regarded as markers of Evangelical identity, both inside and outside the community. The difference between the earlier creationism and its progeny turns mainly on its relationship to scientific methodology. Instead of rejecting the theory of evolution for its implicitly immoral conception of human nature, the tendency now has been to express moral truths in the language of scientific study.


My contention is that creationism is but one example of an Evangelical need to conflate moral and scientific reasoning--to the detriment of both. Because it presumes to pronounce upon matters related to the natural sciences, creationism comes across to most as quackery. With the social sciences, the Evangelical penchant for conflation comes across as a little more credible, no doubt because the object is not merely natural, but also moral. Lift the lid on the world of conservative, Evangelical think-tanks in the United States just a little, and what you discover is a wide-reaching effort to marry old natural law arguments for this or that social configuration with the new methods of social scientific analysis. The result is an odd hybrid of 'outreach strategies' meant to bring in the unchurched while preaching to the choir.

My contention, further, is that this need to conflate scientific claims and moral judgments is widespread. I am going to comment on two specific examples, both using the methods of social science, one to defend the normativity of heterosexual marriage, and the other to justify remaining in conservative churches. Please keep in mind that these patterns of thinking may be found beyond the case studies offered below.


A junior editor over at First Things, Ryan T. Anderson has published 'Marriage: What It Is, Why It Matters, and the Consequences of Redefining It.' It is a position paper outlining the natural law argument for why governments should keep their hands off a social institution whose existence 'precedes' or 'predates' government; one needs to look in the bibliography to find actual sources of empirical evidence. The links provided are not exactly plentiful, but a little patience yields reward. Marriage and the Public Good, published by the Witherspoon Institute, dedicates almost half of its space to the discussion of actual scientific studies of familial relations, including an attempt to measure the effect of the physical presence of fathers on the development of children. Another study by the Social Trends Institute suggests that healthy economies owe something to marriage and fertility rates. Still another study, led by the Institute for American Values, looks at the cost of single-parent and broken homes to the taxpayer. It is immediately obvious that some of these studies are credible, while others leave a person scratching their head.

The plausibility of any one particular study is neither here nor there. Rather, what draws my attention is the basic strategy of statistical correlation that every one of these studies presumes. The form of the statistical argument in social scientific study looks like this: between dataset X and dataset Y there is significant evidence for correlation Z, from which we conclude... The basic strategy is very much like moving puzzle pieces around on a board trying to figure out which ones fit with each other. There is an inescapable element of arbitrariness and contingency in the process of selecting one dataset, or one set of phenomena, instead of another. The social scientist has to be very aware of the experimental constraints under which their dataset is collected. Those constraints have to be factored into any conclusion drawn.
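The arbitrariness is easy to demonstrate in a few lines of code. The following is a minimal sketch, using invented, purely illustrative numbers (not drawn from any of the studies above): any two series that happen to trend in the same direction over the same span will yield a near-perfect correlation Z, no matter what, if anything, actually connects them.

```python
# Hypothetical, purely illustrative numbers: two unrelated series
# that both happen to trend upward over the same ten-year span.
x = [100 + 3 * t for t in range(10)]           # "dataset X"
y = [50 + 2 * t + (t % 2) for t in range(10)]  # "dataset Y"

def pearson(a, b):
    """Pearson correlation coefficient, computed by hand."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = sum((ai - ma) ** 2 for ai in a) ** 0.5
    sb = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return cov / (sa * sb)

# A near-perfect "correlation Z", though nothing links X and Y except
# the arbitrary decision to compare them over the same years.
print(round(pearson(x, y), 3))
```

The number that comes out is close to 1, which is exactly why the selection of datasets, and the constraints under which they were collected, carries so much of the argument's weight.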

The extremely limited purview of a social scientific claim makes empirical generalization difficult. Any generalization is open to an infinite number of qualifications. The data may indeed show it to be a good idea (which is defined empirically as the possession of something or the achievement of some state of mind or well-being) for a child to grow up with their father--excepting those situations where the father is abusive, an alcoholic, a pedophile, a sadist, and so on. An infinitely qualifiable, general empirical claim is not yet a normative moral judgment. The latter sort of judgment has the power of organizing the infinite number of empirical qualifications into categories of good and bad, better and worse. But normative moral judgments are not necessarily sensitive to the complexities of a situation--that is, unless the person passing a moral judgment is sensitive to their own inability to process every possible piece of relevant data. A moral 'ought' (value) can never be extracted from an empirical 'is', no matter how suggestive the empirical 'is' appears. Facts don't appear like anything in particular until they are interpreted by persons who see them as something.


The second example has to do with Evangelical self-assessment. The more conservative strains of Christianity have proved, especially in the last few decades, much more resilient to the tides of cultural change than liberal and mainline Protestant instances of the faith. Not a few Evangelicals have turned the statistical tools furnished by the social sciences on the study of their own traditions and concluded that they must be doing something right. In fact, it was a liberal Protestant by the name of Dean Kelley who brought this to the attention of American readers. The question was again bandied about on the internet a couple of years ago, in response to an opinion piece in the New York Times reflecting on the virtues of doctrinally inflexible versions of faith.

More seriously minded Evangelicals will no doubt want to distance themselves from the cheap panacea of health and wealth preaching, which says, if we do A, B, and C, then God will reward us in very tangible ways. But the subtle allure of a statistical study, which discerns God's hand in absolute numbers and percentage increases, plays on the same sort of need to reduce divine purpose to very tangible manifestations. The only difference is that one has the air of respectability because it appeals to certain scientific standards. There is also an irony implicit in conservative Christian appeals to raw data to demonstrate the superiority of their principled version of the faith over that of liberal and mainline churches. What killed liberal Christianity, the story goes, was a watered-down social gospel grounded in late 19th century sociological theory. If sociology always leads to bad theology, just as empirical conclusions should never be confused with moral judgments, what business do Evangelicals, on the terms of their own confession, have framing their arguments in the contemporary methods of social scientific study?


It is more or less true that reality has a liberal bias. By refusing to search for moral significance in every piece of empirical data, the liberal mindset is much more in tune with the contingent and conditional conclusions of the natural and social sciences. There are, of course, liberally-minded individuals who also behave as if certain scientific conclusions obviously lead to certain moral conclusions. But those moral conclusions usually presume something about the individual's right to self-determination, which conforms to the contingent and conditional character of scientific conclusions. The conservative Evangelical mind still assumes the natural order is fundamentally Aristotelian, invested with teleological purpose, orienting all things towards their transcendent end, foreshortening the distance between fact and value. There would be no problem if this were the 16th century. But in the 21st century, our basic assumptions about scientific investigation of the natural order have fundamentally changed.

Perhaps more troubling is that the attempted marriage of natural law theory and the methods of social scientific study has the potential to cut the heart out of the longstanding Western moral and legal tradition. Essential to the rule of law is the conviction that one has to distinguish between the actions and the nature of a person, allowing for the presumption of innocence until guilt is proven. If the only way one knows how to think about human beings is to quantify human action, prior assumptions about the nature of the human being--e.g. as a thinking thing or rational animal deserving of the dignity owed to any being sharing in that nature--cease to hold sway over our moral imagination.