Monday, December 31, 2012

Lawyer Thinking vs. Scientist Thinking

Over at Cross Examined, hosted on Patheos, Bob Seidensticker distinguishes between two modes of thinking: open-ended scientific inquiry into the truth of the matter and a lawyer's biased defense of a perspective on the truth of the matter. The blog post is a response to a Christian apologist's critique of an earlier attempt to clarify the distinction between the two sorts of thinking. Bob's disdain for lawyer thinking is palpable; to be fair, he thinks it appropriate to the courtroom setting, though he doesn't seem to think it appropriate anywhere else.

This blog caught my attention because Bob does not go out of his way to stick it to the religious nutcases. If anything, it is a more general account of how the truth is known. There will be implications for a 'scientific' perspective and a 'religious' perspective. Bob associates scientist thinking with being open-minded, and lawyer thinking with being closed-minded. It won't come as a surprise to Bob, then, that legal language permeates the Hebrew and Christian Scriptures, since he thinks religious people closed-minded. The relationship between God and creation is defined in terms of covenant, sinners are covenant breakers against whom God presses charges in a heavenly court of law, and the function of the priestly caste is to make restitution for blameworthy actions.

The old debate about religion versus science is not the point of Bob's blog, even though Richard Dawkins does make a celebrity appearance. The actual point of contention concerns whether historical study constitutes an example of scientist thinking or lawyer thinking. Bob makes a rather high-minded (or, dare I say, Pollyanna-ish) assertion that historians 'use' scientist thinking rather than lawyer thinking. And I have to wonder what sorts of evidence Bob draws on to make this assertion. Has he read a couple of history books? Has he watched some professorial lectures online? Maybe he took a few history courses in college or university. Better yet, has he sat through an academic conference debating the pros and cons of this or that point of interest? Or, even better yet, has he studied historiography, or the history of historiography?

Bob will have to forgive me for coming across as a bit glib. I have done all these things. His insistence that historians use scientist thinking, not lawyer thinking, doesn't ring true--and not merely because of the laughably simplistic definitions that Bob provides. One wonders, for example, whether Bob ever read and took to heart Thomas Kuhn's The Structure of Scientific Revolutions. If he had, he would be aware that even scientific paradigms are inherently prejudiced, that there is no single standard of scientific rationality, and that different scientific disciplines operate with different criteria of truth in mind. The idea that any human endeavor could be entirely open-minded--'an egoless and collaborative search for the truth by following the facts where they lead'--is utopian and has very little to do with actual scientific study, i.e. it's the sort of thinking with which religious persons have an intimate acquaintance. Collaboration and consensus have nice rings to them; but they are ideals, things we aim at and hope for, never realized in actuality.

The methods of scientific study do contribute to the study of human history, but they play second fiddle to the methods of humanistic studies, which end up looking a lot like Bob's scanty description of lawyer thinking. What distinguishes the study of human history from the study of the natural sciences is the fact that the principal body of evidence of which historians make use is created or generated by other human beings. Texts, buildings, coins, clothing, and so on, fit into this category. Rocks and trees don't communicate with human beings like human beings can and do communicate with other human beings. There is, after all, a fundamental difference between naturally occurring rocks and trees and the rocks and trees with the word 'Hello' etched or carved into their surface. The latter have become a medium of human communication. This is a very simple truism, and Bob seems to have missed its importance. The study of human thought and action is subject to wholly different standards than the study of the natural world. The study of human thought and action is a self-referential study--human beings studying other human beings--whereas the study of rocks and trees is not. And, as it is human, it is not egoless.

As soon as questions about what another person meant by saying or doing this or that come to the fore, what Bob calls lawyer thinking takes precedence over what he calls scientist thinking. Why? Bob also misunderstands what goes on in a courtroom. In fact, the antagonism of a courtroom is highly consensual and collaborative. Everyone knows the roles prepared for them to play. Truth is supposed to emerge from a process of investigation and interrogation. There are rules that must be followed if the desired outcome is to be achieved. That desired outcome serves the general social consensus on the legal form of the political community. The process has been established to assess evidence that is generated by human beings. Historians engage in a similar process of ordered antagonism when evaluating their human-generated evidence. Different historians make their partial offerings to the study of a particular topic. Each definitive study can be taken to be a single ruling on a case. But the law and its interpretation, like human history and its interpretation, evolve over time. Old precedents are overturned; new precedents are established.

My suggestion to Bob is that he ought to think less about different sorts of thinking and more about the different sorts of things about which human beings think--like the difference between rocks and trees, on the one hand, and texts and buildings, on the other.

Sunday, December 30, 2012

The Supposed Prospects of Online Education

People are beginning to pay attention to an apparent shift from traditional forms of higher education, as offered by the modern research university, towards online dissemination of intellectual materials. The most progressive among us seem to think that the university campus, and the high rates of tuition that come along with it, are on the way out. The idea that middle-class teenagers would leave home, get an education, and start a new career, new life, and possibly a new family may not last long in our cultural consciousness.

Most attempts to make sense of whether the modern research university will decline in the face of the proliferation of online opportunities rest on what I believe are faulty assumptions. The university is thought merely to offer a product for consumers. (See the Economist or Slate.) People want information in order to advance themselves professionally, to develop a new set of skills, or purely to satisfy curiosity. The internet makes that information readily available to anyone who wants to pay a price, one that is much smaller than traditional tuition fees. But, in fact, the modern research university has never been exclusively an institutional context for the generation and dissemination of knowledge. It also serves the function of accrediting students for entrance into a wide range of professions. It establishes a baseline against which non-academic businesses can judge the suitability of a job candidate. If you don't have a degree, the mantra goes, don't bother applying for the job.

The internet has long been lauded for making possible the democratization of knowledge. The possibility of learning anything, doing anything, and making anything of yourself is held out for dreamers to take hold of. But a true democratization of knowledge and human potential is not possible. Even Wikipedia is not without a system of quality control. There will always exist communally-sanctioned processes whereby individuals are vetted, accredited, and steered towards a limited number of vocational options. The modern research university has established itself to fill just that role.

If online course offerings are ever going to meaningfully challenge the modern university, a number of things will need to happen. The most obvious is that an online university will have to show that it can maintain the academic standards of a traditional university, if the forum is to be credible. It will also have to demonstrate that it can maintain the interest of students for periods of time long enough to complete a degree. Traditional university settings were able to encourage this through immersion. Someone who enrolled in an online institution would do so presumably because they had other time commitments. Certainly there is a market for this type of program, but it can hardly be the standard model. Perhaps the immersive experience can be achieved through Second Life or a similar form of virtual interaction.

Let us not jump to any hasty conclusions about the demise of traditional forms of university education. Thus far, websites like Coursera and Udacity can only augment and enhance traditional forms of education.

Saturday, December 29, 2012

The Human Face in Les Miserables

Over at Philosophical Fragments, hosted on Patheos, Timothy Dalrymple takes a stab at the theology of Les Miserables. Now, I don't know how THE theology of Les Mis could be summed up in so few words; Tim's attempt gives short shrift both to theology and to the sorts of questions the latest offering of Les Mis has been used to explore.

The latest big-screen adaptation, starring Hugh Jackman, Russell Crowe, Anne Hathaway, and a few others of smaller repute like the comically serious Sacha Baron Cohen, caught me off guard. Familiarity with Victor Hugo's classic study of virtue and vice, joy and misery, fall and redemption in the period of revolutionary France convinced me of the theological potential for any cinematic adaptation. A number of years ago, I also saw a stage production of Les Mis in Toronto. The motifs of revolution were displayed across the stage; the death of tyrants by the will of the people, and so on. What I did not remember was the ubiquitous presence of the crucifix and sighs for divine salvation in every second song. The absence of visual effects on stage may have contributed to my inability to hear the otherwise pious content of the musical's songs.

So you must try to imagine my surprise to hear the songs anew. Hollywood seems to know only the rule of an eye for an eye. Repay violence with violence, and deal death to those who stand in your way. Hollywood doesn't seem to understand that conflict can be resolved by an act of charity. Watch Peter Jackson's latest The Hobbit to see just how difficult it is for contemporary filmmakers to negotiate between these two basic patterns of narrative resolution. Bilbo may spare Gollum's life, certainly, but a thousand more like him are cut down without a second thought.

The theology of Les Mis, on Tim's account, can be encapsulated in a character study of Javert and Valjean. Each in his own way, both are men of God. Javert loves God for his law, the source of orderliness in the world, while Valjean loves God for his grace and mercy. Or you could say that Javert loves God the Creator and Valjean, God the Redeemer, exemplified especially by a voluntary act of mercy. As Creator, God is abstract, distant; but as Redeemer, he is personal, ready-to-hand. And, it goes without saying, this division is just a bit too neat and tidy, and a bit too narrow in scope as well. Hugo would have known that one does not exclude the other; that God can be both, even if the human mind will not very easily allow that these two images can make a single divine visage. And it is almost inconceivable that the rest of the cast of characters should be no more than asides in the dialogue of a schizophrenic God with himself.

The interest the film shows in the human face is abrasive. Over on Slate, Dana Stevens comments, 'few performers can sing vocally demanding, dramatic solos while a movie camera inspects their nostrils.' Indeed, the interest is so abrasive that you do not have to look far to find some film critic complaining about the director's decision to spend the better portion of the film with the camera looking closely at someone's face. But the human face is the point, and let the critics who do not see as much be damned for it.

Alone in the world of objects--of things like rivers, rocks, and trees, of plants, animals, or even parts of things like the human hand or torso--the human face is lit up with intelligence. Not merely potentially intelligible, it actively communicates intelligibility. And that shines through all the blood, mud, and sweat that may cover the face.

The human face demands response, even if the only response offered is a refusal to respond. The imperative is found not only in Javert or Valjean's faces, but in every face. It is not reducible to any one person; it is shared by every person, regardless of their station in life, which is exactly what is meant when the human being is said to be created in the image of God.

Thursday, December 27, 2012

The Death of Leaders

Christmas this year saw both George H.W. Bush and Nelson Mandela in hospital to treat the health complications that accompany old age. Eighty-eight and ninety-four years of age, respectively, Bush has come down with a troublesome fever and Madiba has contracted a lung infection. None of the news sources of which I am aware have done much more than comment on the immediate causes of hospitalization. The fact that these sorts of things are to be expected as a person gets older is left out.

We are no longer supposed to believe in great men who, according to some exceptional endowment, stand apart from the mass of humanity. For all intents and purposes, however, we still behave as if we do. The attention lavished on Mandela is not that hard to comprehend. His very public role in the downfall of apartheid in South Africa, in the defense of common human decency and dignity, is readily rewarded by placing him in a special category of human being. The very public function of the U.S. Presidency also sets Bush apart, even if it is not as apparent why this should be the case. It seems, deep down in our bones, we retain a sense of divine sovereignty in the forms of authority with which we are familiar.

Our leaders must be deathless--that is, until they die, at which point we reflect on our own frail humanity reflected in the death of a leader. This is the only reason I can discern for why the most perceptive of reporters and news anchors fail to perceive the reality of death, fail to comment on it, fail to remind us of the ubiquitous nature of the Great Equalizer. The best they seem able to do is express horror at the thought that our leaders may one day go the way of all flesh, how sad their families will be, how sad we will be to no longer be blessed by their munificent presence.