Tuesday, October 31, 2006

Eric Alterman - "And how about a little noise about Salah Choudhury?"

Eric Alterman gets right to the point here. Do we believe in freedom of expression or don't we? If we do, then "how about a little noise about Salah Choudhury?"

(For some further information on this case, see here & here & here.)

--Jeff Weintraub
====================
Altercation by Eric Alterman
October 30, 2006
And how about a little noise about Salah Choudhury?

The mainstream media seems to be ignoring the case and leaving it to the right-wingers and Jewish-oriented publications. What's up with that? I am so sick of being sensitive to Muslim and other sensibilities that deny people's right to free speech because what you say might offend them. Life is offensive, OK? I'm offended every time George Bush or Dick Cheney opens their respective mouths, but I deal with it. If liberals believe in anything, we believe in the Enlightenment, and tough luck on those who don't. Sign this petition. Send it around. (Thanks to Jeff Weintraub.)

"Two faces of persecution" - Salah Choudhury & Maher Arar (Terry Glavin)

Without support, Bangladeshi Salah Uddin Shoaib Choudhury (left) risks prison or death—a far cry from CanWest’s ordeal over coverage of Maher Arar’s deportation.
The journalist & blogger Terry Glavin, writing from Canada, reminds us of some concrete reasons why freedom of the press in particular and freedom of expression in general are principles worth defending--but he also reminds us that they mean very little in practice unless people who enjoy freedom of expression are willing to use it to expose and protest against repression and injustice. Here's the heart of his piece:

--------------------
Salah Uddin Shoaib Choudhury is a Bangladeshi journalist. [JW: see here & here & here] He’s the editor of the weekly newspaper the Blitz. You’ve probably never heard of him. Even in Dacca, the only journalism he’s really known for is a thin portfolio of essays that counsel peaceful coexistence between Muslims, Christians, and Jews.

Juliet O’Neill is a Canadian journalist. I’m sure you’ve heard of her. She’s a seasoned reporter with the Ottawa Citizen. She’s most famous for having had her home raided two years ago by RCMP officers who confiscated notes, files, and computer disks, hoping to discover the identity of certain high-level intelligence-agency sources she’d been relying on for some blockbuster front-page stories. [....]

Choudhury’s newspaper was bombed last summer. Early this month, extremist thugs raided and looted the newspaper offices and Choudhury was beaten and robbed. Denied police protection, Choudhury went into hiding. On October 12, he emerged long enough to appear in court on charges that carry a sentence of up to 30 years' imprisonment or death.

A week after Choudhury’s court appearance in Dacca, the Ontario Superior Court handed down a decision that sided with O’Neill in a court challenge her newspaper had waged against the constitutional legitimacy of the RCMP raid on O’Neill’s house. The Ottawa Citizen heralded its court victory with a headline taken directly from the reaction uttered by David Asper, the vice president of CanWest Global, which owns the Citizen and dozens of other major Canadian media properties: “The Brute Force of the State Met Its Match”.

[But the case of Juliet O'Neill and the Ottawa Citizen is not the right analogy for Choudhury's case. --JW] The only Canadian case that comes close to Choudhury’s suffering is the agony of Maher Arar, the Syrian-born Canadian engineer whose story looms over everything the O’Neill court case was about.

After certain RCMP officials provided American authorities with false information implicating Arar in terrorist activity, Arar was picked up while changing planes in New York in 2002. He was sent on to Damascus, where he was brutally tortured and kept in solitary confinement for much of the year he spent in prison there.

Choudhury’s troubles began around the same time. His brave work started getting him noticed outside Bangladesh, and he was invited to speak at a Hebrew Writers’ Association conference in Israel in 2003. He was arrested at the airport in Dacca before he could board his plane. He was jailed on charges of sedition and espionage, beaten, tortured, and kept in solitary confinement for 16 months.

It was only after an international public outcry—mainly in the United States, where he was honoured with a PEN USA Freedom to Write Award [JW: see here]—that Choudhury was released. But the phony charges against him were recently revived by a notorious Islamist judge, and it’s on those charges that Choudhury’s trial resumes next month.

It was also only after a public outcry, here in Canada, that Syrian authorities agreed to let Arar come home to Canada, in 2003. But rogue elements within the RCMP, hoping to cover their tracks in the case, immediately revived their smear campaign against Arar. They fed O’Neill and other journalists the most vicious and outrageous lies about him. Finding out who those rogues were was the main reason the RCMP raided O’Neill’s house two years ago.

Last month, the commission of inquiry into the Arar affair—the analysis and recommendations section alone is 376 pages long—found that there never was a shred of evidence against him after all. No secret al-Qaeda code name, no time spent in Afghan desert training camps, no facilitation of terrorism-logistics work in and around Ottawa, nothing. None of those things you read about Arar in the newspapers was true. [....]

We all make mistakes. I don’t claim to be braver than Juliet O’Neill, and I make no charge of bad faith against her. The Georgia Straight is no braver than the Ottawa Citizen, either. We don’t need to be. Life is easy here. This is Canada. It’s not, say, Bangladesh, where 12 journalists have been murdered during the past four years, and where the dark shadow of Islamist extremism grows longer by the hour.

There is one small thing we might all do, though, to redeem the tawdriness of our vocation in this country, as an act of contrition for Maher Arar. Given its reach, CanWest Global could be particularly helpful in that one small thing.

It’s in the matter of a brave, 43-year-old journalist, with failing eyesight, who lives in fear for his life, as I write this, in Dacca, Bangladesh. The last time he was in trouble, it was an international public outcry that got him out of jail.

His name is Salah Uddin Shoaib Choudhury.
------------------

=> Canadians are not the only ones who should respond to this eloquent appeal. Choudhury's case ought to concern all of us. I put it this way in a recent post of my own (Freedom of the press under attack - Bangladeshi journalist Salah Choudhury faces the death penalty):
--------------------
This is not just a tale of woe, but also a call to action. Over the past decade there have been several significant cases involving the persecution, arrest, and/or prosecution of writers and intellectuals where international attention has helped to avert, or at least moderate, unjust and repressive outcomes. [....] International response to these cases, and international solidarity with the victims, are obviously very important to help preserve some space for freedom of expression and to encourage possibilities for political liberty and political sanity.

The case of the outspoken Bangladeshi journalist Salah Uddin Shoaib Choudhury, now on trial facing a trumped-up charge of treason with a possible death penalty, is another important challenge of this sort. [....] He has faced years of persecution, including physical attacks and death threats as well as criminal prosecution, for his 'crimes' of criticizing Islamist radicalism and advocating reconciliation with Christians, Jews, and Israel. [....]

Choudhury was awarded the PEN-USA Freedom to Write Award in 2005, and his cause has been taken up by the Writers in Prison Committee of International PEN. Their statement of October 10 [here] urges that everyone committed to freedom of expression should:

Send appeals to authorities:
- expressing serious concerns for the safety of journalist Salah Uddin Choudhury
- calling for him to be provided with immediate and effective police protection
- protesting the charges against Choudhury and calling for them to be dropped in accordance with Article 19 of the United Nations Universal Declaration of Human Rights [....]
However, this case is still not getting nearly the attention that it deserves. It seems to me that Choudhury's cause is one that all people who support the principles of political and intellectual freedom and who would like to defend possibilities for democracy, political sanity, and constructive international dialogue should be especially interested in taking up. And the reasons go beyond the obvious threat to freedom of the press and free expression that this case represents, though these should be sufficient. Journalists in the Muslim world who are willing to stick their necks out to take positions like Choudhury's are not entirely non-existent, but they're not very numerous either, and they take especially great risks when they do this. If they're going to get their necks cut off for it, then all of us will be losers. They deserve strong and principled support. [.....]

Therefore, along with International PEN and others, I strongly urge people to spread the word about this case and to write to the Bangladeshi government expressing their concern. [....]
--------------------

For further information and some relevant addresses, see HERE.

Yours for freedom of expression and democratic solidarity,
Jeff Weintraub
====================
Georgia Straight
October 26, 2006
Two faces of persecution
By Terry Glavin

Salah Uddin Shoaib Choudhury is a Bangladeshi journalist. He’s the editor of the weekly newspaper the Blitz. You’ve probably never heard of him. Even in Dacca, the only journalism he’s really known for is a thin portfolio of essays that counsel peaceful coexistence between Muslims, Christians, and Jews.

Juliet O’Neill is a Canadian journalist. I’m sure you’ve heard of her. She’s a seasoned reporter with the Ottawa Citizen. She’s most famous for having had her home raided two years ago by RCMP officers who confiscated notes, files, and computer disks, hoping to discover the identity of certain high-level intelligence-agency sources she’d been relying on for some blockbuster front-page stories.

If Choudhury and O’Neill were ever to find themselves competing for a bravery-in-journalism prize, O’Neill would lose. Hands down.

Choudhury’s newspaper was bombed last summer. Early this month, extremist thugs raided and looted the newspaper offices and Choudhury was beaten and robbed. Denied police protection, Choudhury went into hiding. On October 12, he emerged long enough to appear in court on charges that carry a sentence of up to 30 years' imprisonment or death.

A week after Choudhury’s court appearance in Dacca, the Ontario Superior Court handed down a decision that sided with O’Neill in a court challenge her newspaper had waged against the constitutional legitimacy of the RCMP raid on O’Neill’s house. The Ottawa Citizen heralded its court victory with a headline taken directly from the reaction uttered by David Asper, the vice president of CanWest Global, which owns the Citizen and dozens of other major Canadian media properties: “The Brute Force of the State Met Its Match”.

The only Canadian case that comes close to Choudhury’s suffering is the agony of Maher Arar, the Syrian-born Canadian engineer whose story looms over everything the O’Neill court case was about.

After certain RCMP officials provided American authorities with false information implicating Arar in terrorist activity, Arar was picked up while changing planes in New York in 2002. He was sent on to Damascus, where he was brutally tortured and kept in solitary confinement for much of the year he spent in prison there.

Choudhury’s troubles began around the same time. His brave work started getting him noticed outside Bangladesh, and he was invited to speak at a Hebrew Writers’ Association conference in Israel in 2003. He was arrested at the airport in Dacca before he could board his plane. He was jailed on charges of sedition and espionage, beaten, tortured, and kept in solitary confinement for 16 months.

It was only after an international public outcry—mainly in the United States, where he was honoured with a PEN USA Freedom to Write Award—that Choudhury was released. But the phony charges against him were recently revived by a notorious Islamist judge, and it’s on those charges that Choudhury’s trial resumes next month.

It was also only after a public outcry, here in Canada, that Syrian authorities agreed to let Arar come home to Canada, in 2003. But rogue elements within the RCMP, hoping to cover their tracks in the case, immediately revived their smear campaign against Arar. They fed O’Neill and other journalists the most vicious and outrageous lies about him. Finding out who those rogues were was the main reason the RCMP raided O’Neill’s house two years ago.

Last month, the commission of inquiry into the Arar affair—the analysis and recommendations section alone is 376 pages long—found that there never was a shred of evidence against him after all. No secret al-Qaeda code name, no time spent in Afghan desert training camps, no facilitation of terrorism-logistics work in and around Ottawa, nothing. None of those things you read about Arar in the newspapers was true.

But Justice Dennis O’Connor didn’t just lay the blame for Arar’s destroyed reputation at the feet of rogue Mounties. O’Connor’s report is just as scathing about those same Canadian journalists who now crow about their valiant defiance of the “brute force” of the Canadian state in the Arar case. Their court challenge was all about defending their right to continue hiding the identities of the cops who told all those lies and caused Arar such suffering to begin with.

We all make mistakes. I don’t claim to be braver than Juliet O’Neill, and I make no charge of bad faith against her. The Georgia Straight is no braver than the Ottawa Citizen, either. We don’t need to be. Life is easy here. This is Canada. It’s not, say, Bangladesh, where 12 journalists have been murdered during the past four years, and where the dark shadow of Islamist extremism grows longer by the hour.

There is one small thing we might all do, though, to redeem the tawdriness of our vocation in this country, as an act of contrition for Maher Arar. Given its reach, CanWest Global could be particularly helpful in that one small thing.

It’s in the matter of a brave, 43-year-old journalist, with failing eyesight, who lives in fear for his life, as I write this, in Dacca, Bangladesh. The last time he was in trouble, it was an international public outcry that got him out of jail.

His name is Salah Uddin Shoaib Choudhury.

The Chronicles blog can be found at transmontanus.blogspot.com/

Sunday, October 29, 2006

A Short History of the Bush Administration (Marc Cooper)

Marc Cooper constructs a useful brief outline of the Bush II experience from a series of discarded slogans:
"We will restore honor and integrity to the White House."
"Wanted: Dead or Alive."
"Mission Accomplished."
"Freedom is on the march."
"As Iraqis stand up, we will stand down."
"We will stay the course."
This is a pretty ingenious approach to summing up the Bush Presidency (which, alas, still has two more years to run). With a little thought, further illuminating examples readily come to mind.

=> The first of these abandoned slogans, the Bush/Cheney pledge to "restore honor and integrity to the White House," takes us back to those far-away days of 2000. It may now seem hard to believe, but that really was a pervasive theme of Bush's 2000 Presidential campaign ... which, in reality, gave us one of the most systematically dishonest, secretive, irresponsible, pervasively incompetent, and constitutionally high-handed administrations in American history. (And remember "compassionate conservatism," "a uniter, not a divider," and all the rest?)

In 2004 a brilliant little parody in The Onion recaptured this broken pledge with an imaginary Bush campaign statement full of direct quotations from his 2000 campaign. Read it and weep (or laugh, or both).
Addressing guests at a $2,000-a-plate fundraiser, George W. Bush pledged Monday that, if re-elected in November, he and running mate Dick Cheney will "restore honor and dignity to the White House."
“After years of false statements and empty promises, it’s time for big changes in Washington,” Bush said. “We need a president who will finally stand up and fight against the lies and corruption. It’s time to renew the faith the people once had in the White House. If elected, I pledge to usher in a new era of integrity inside the Oval Office.”
Bush told the crowd that, if given the opportunity, he would work to reestablish the goodwill of the American people "from the very first hour of the very first day" of his second term.
"The people have spoken," Bush said. "They said they want change. They said it’s time to clean up Washington. They’re tired of politics as usual. They’re tired of the pursuit of self-interest that has gripped Washington. They want to see an end to partisan bickering and closed-door decision-making. If I’m elected, I’ll make sure that the American people can once again place their trust in the White House."
Bush said the soaring national debt and the lengthy war in Iraq have shaken Americans’ faith in the highest levels of government.
--Jeff Weintraub

Good news & bad news from Bangladesh - Muhammad Yunus & Salah Choudhury

A recent letter to the Washington Post from a reader, Roberta Dzubow, brought out a painful contrast.
The Nobel Peace Prize awarded to Muhammad Yunus of Bangladesh [see here & here --JW] is well deserved.

Mr. Yunus began his program by giving small loans to Bangladeshi women. The women used the seed money wisely, thrived and repaid their loans. It should be especially significant in Muslim countries, where women are usually undervalued and oppressed, that over 97 percent of the 6.6 million loans made during the past 30 years have been to women.

However, as Bangladesh basks in the glow of Mr. Yunus's award, the spotlight of shame should also be on it. The Bangladeshi government arrested and tortured journalist Salah Uddin Shoaib Choudhury and now threatens him with death. [....]
Ami Isseroff (of MidEastWeb), who has paid close attention to Choudhury's case, sent a follow-up letter to the Washington Post that powerfully drives this point home. It's worth putting in boldface: Choudhury "faces the death penalty for advancing the cause of moderation. Unless his case gets significant support from abroad he may well die."
Dear editor,

I would like to call your attention to the plight of Bangladesh journalist Salahuddin Shoaib Choudhury, who is on trial for "sedition" in Bangladesh. As noted by a Washington Post reader [here], Choudhury's "crime" was that he wanted Bangladesh to open diplomatic relations with Israel and he spoke out against Islamist extremists. These are policies of the US government. Sedition carries the death penalty in Bangladesh. The offices of his journal have been ransacked and he and members of his staff have been beaten.

This man faces the death penalty for advancing the cause of moderation. Unless his case gets significant support from abroad he may well die. Meanwhile, he has suffered a three-year Kafkaesque nightmare of judicial harassment. Yet most major US newspapers have not written about his plight. For the most part he has been ignored by the US State Department, by Human Rights Watch and Amnesty International, though RSF and PEN have tried to help him.

I call on the Washington Post to bring this case to the attention of the American public.

Thank you,
Sincerely,
Ami Isseroff (www.mideastweb.org)
I reproduce this letter with permission from Ami Isseroff, who added a personal note via e-mail:
I searched Google for news of Salah's trial. I found nothing from major journals in the US. Only right-wing periodicals and Jewish specialty journals and the Jerusalem Post in Israel seem interested in the case. I found a lone entry for the Washington Post, which turned out to be a reader comment. The silence is deafening. That is what triggered my letter to the Washington Post. [my bolding --JW]
=> This widespread indifference (with just a few honorable exceptions) is indeed shocking and alarming. Choudhury's case is one that should concern all of us. As I said in a recent post, Freedom of the press under attack - Bangladeshi journalist Salah Choudhury faces the death penalty:
--------------------
This is not just a tale of woe, but also a call to action. Over the past decade there have been several significant cases involving the persecution, arrest, and/or prosecution of writers and intellectuals where international attention has helped to avert, or at least moderate, unjust and repressive outcomes. [....] International response to these cases, and international solidarity with the victims, are obviously very important to help preserve some space for freedom of expression and to encourage possibilities for political liberty and political sanity.

The case of the outspoken Bangladeshi journalist Salah Uddin Shoaib Choudhury, now on trial facing a trumped-up charge of treason with a possible death penalty, is another important challenge of this sort. [....] He has faced years of persecution, including physical attacks and death threats as well as criminal prosecution, for his 'crimes' of criticizing Islamist radicalism and advocating reconciliation with Christians, Jews, and Israel. [....]

Choudhury was awarded the PEN-USA Freedom to Write Award in 2005, and his cause has been taken up by the Writers in Prison Committee of International PEN. Their statement of October 10 [here] urges that everyone committed to freedom of expression should:
Send appeals to authorities:
- expressing serious concerns for the safety of journalist Salah Uddin Choudhury
- calling for him to be provided with immediate and effective police protection
- protesting the charges against Choudhury and calling for them to be dropped in accordance with Article 19 of the United Nations Universal Declaration of Human Rights [....]
However, this case is still not getting nearly the attention that it deserves. It seems to me that Choudhury's cause is one that all people who support the principles of political and intellectual freedom and who would like to defend possibilities for democracy, political sanity, and constructive international dialogue should be especially interested in taking up. And the reasons go beyond the obvious threat to freedom of the press and free expression that this case represents, though these should be sufficient. Journalists in the Muslim world who are willing to stick their necks out to take positions like Choudhury's are not entirely non-existent, but they're not very numerous either, and they take especially great risks when they do this. If they're going to get their necks cut off for it, then all of us will be losers. They deserve strong and principled support. [.....]

Therefore, along with International PEN and others, I strongly urge people to spread the word about this case and to write to the Bangladeshi government expressing their concern. [....]
--------------------

For further information and some relevant addresses, see HERE.

Yours for freedom of expression and democratic solidarity,
Jeff Weintraub

P.S. During an earlier stage of Choudhury's ordeal, in 2003, the New York Times published a strong editorial supporting him ("The Risks of Journalism in Bangladesh"). In 2006, for some reason, the NYTimes seems to have fallen silent about this matter, but what they said in 2003 is still very much on target.

Scholasticism, charisma, & rationalization in the modern university - Or, rationalization and its limits (New Yorker)

The October 23 issue of the New Yorker carried an elegant, engaging, and thought-provoking piece by the historian Anthony Grafton about the curiously hybrid character of the modern university and the ways that it has taken form historically. Grafton's piece is a review essay on a recent book by another historian, William Clark, Academic Charisma and the Origins of the Research University.

"Academic charisma?", some readers may ask. Well, here is a beautiful example quoted by Grafton, in which Mark Twain describes his first sight of the great German historian and academic empire-builder Theodor Mommsen.
--------------------
At a Berlin banquet in 1892, Mark Twain, himself a worldwide celebrity, stared in amazement as a crowd of a thousand young students “rose and shouted and stamped and clapped, and banged the beer-mugs” when the historian Theodor Mommsen entered the room:
This was one of those immense surprises that can happen only a few times in one’s life. I was not dreaming of him; he was to me only a giant myth, a world-shadowing specter, not a reality. The surprise of it all can be only comparable to a man’s suddenly coming upon Mont Blanc, with its awful form towering into the sky, when he didn’t suspect he was in its neighborhood. I would have walked a great many miles to get a sight of him, and here he was, without trouble, or tramp, or cost of any kind. Here he was, clothed in a titanic deceptive modesty which made him look like other men. Here he was, carrying the Roman world and all the Caesars in his hospitable skull, and doing it as easily as that other luminous vault, the skull of the universe, carries the Milky Way and the constellations.
--------------------

Those were the days....

But with appropriate adjustments, it's actually not difficult to come up with examples of academic cult figures from the more recent past and present (names like Einstein, Bourdieu, Furet, Galbraith, Geertz, and Chomsky come to mind)--though one also has to add some key qualifications. One is that the academic scholar-stars who attract the greatest authority and respect within their own disciplines are often not the ones who get it from wider public opinion, or even from academics in other disciplines. Another is that these figures exist in an academic culture largely populated by different and more prosaic types--in particular, masses of academic worker-bees routinely churning out "research" and an even larger teacher-proletariat.

Furthermore, these and other academic types operate in an institutional world that involves a complex and shifting mixture of craft, patrimonial, and pseudo-industrial practices; guild-style collegial structures and bureaucratic-administrative authority; meritocratic, personalist, and hierarchical elements in both ideology and practice; forward-looking progressive rhetoric combined with peculiar appeals to traditionalism (real and invented) and emotional loyalties; and so on.

Clark's book, which I haven't yet read, is far from the first effort to make sense of this puzzling phenomenon, but Grafton's review makes it sound like an especially interesting and perceptive one. Meanwhile, Grafton's own discussion is interesting enough. Grafton's central point is that if one wants to understand the character of the modern western university, a necessary starting point is to consider the historical process by which this peculiar institution took shape, since that history continues to leave its marks on the present. The perspective that he and Clark bring to bear is a loosely Weberian one:
--------------------
But what does the academic agenda of the modern research-based university have to do with the other side of college life as we know it—with fraternity pledges, the choruses of “Gaudeamus igitur,” the stone façades of Victorian Gothic buildings? The mixed inheritance of the modern university is the subject of a new book with the somewhat oxymoronic title “Academic Charisma and the Origins of the Research University,” by William Clark, a historian who has spent his academic career at both American and European universities. Clark thinks that the modern university, with its passion for research, prominent professors, and, yes, black crêpe, took shape in Germany in the eighteenth and nineteenth centuries. And he makes his case with analytic shrewdness, an exuberant love of archival anecdote, and a wry sense of humor. It’s hard to resist a writer who begins by noting, “Befitting the subject, this is an odd book.”

Clark’s story starts in the Middle Ages. The organizations that became the first Western universities, schools that sprang up in Paris and Bologna, were in part an outgrowth of ecclesiastical institutions, and their teachers asserted their authority by sitting, like bishops, in thrones—which is why we still refer to professorships as chairs—and speaking in a prescribed way, about approved texts. “The lecture, like the sermon, had a liturgical cast and aura,” Clark writes. “One must be authorized to perform the rite, and must do it in an authorized manner. Only then does the chair convey genuine charisma to the lecturer.” Clark assumes his notion of charisma, loosely but clearly, from the work of Max Weber, who developed the idea that authority assumes three forms. Traditional authority, the stable possession of kings and priests, rested on custom, “piety for what actually, allegedly or presumably has always existed.” Charismatic authority, wild and disruptive, derived from “the exceptional sanctity, heroism or exemplary character of an individual person.” Rational authority, the last of the three forms to emerge, represented the rise of bureaucratic procedure, dividing responsibilities and following precise rules.

As Weber pointed out, in real organizations these different forms of authority interact and collide. In the medieval classroom, for all its emphasis on tradition-bound hierarchy and order, a contrary force came into play, one that unleashed the charisma of talented individuals: the disputation, in which a respondent affirmed the thesis under discussion and an opponent attempted to refute it. (Unlike the lecture, the disputation hasn’t survived as an institution, but its modern legacy includes the oral defenses that Ph.D. candidates make of their theses, and the format of our legal trials.) [....]

Traditionalist plodders and charismatic firebrands shared the university from the beginning. The heart of Clark’s story, however, takes place not during the Middle Ages but from the Renaissance through the Enlightenment, and not in France but in the German lands of the Holy Roman Empire. This complex assembly of tiny territorial states and half-timbered towns had no capital to rival Paris, but the little clockwork polities transformed the university through the simple mechanism of competition. German officials understood that a university could make a profit by attaining international stature. Every well-off native who stayed home to study and every foreign noble who came from abroad with his tutor—as Shakespeare’s Hamlet left Denmark to study in Saxon Wittenberg—meant more income. And the way to attract customers was to modernize and rationalize what professors and students did.

These German polities called themselves “police states”—not in the sense of being oppressive but, as Clark explains, in the sense that they tried “to achieve the good policing, die gute Policey, of the land by monitoring and regulating the behavior of subjects by paperwork.” [....] Bureaucracy has its own logic, and officials pushed for results that looked rational: results that they could codify, sort, and explain to their masters. Glacially, the universities responded. [....]

In an even more radical break with the past, professors began to be appointed on the basis of merit. [....] Around the turn of the nineteenth century, the pace of transformation reached a climax.

In these years, intellectuals inside and outside the university developed a new myth, one that Clark classes as Romantic. They argued that Wissenschaft—systematic, original research unencumbered by superstition or the authority of mere tradition—was the key to all academic achievement. If a university wanted to attract foreign students, it must appoint professors who could engage in such scholarship. At a great university like Göttingen or Berlin, students, too, would do original research, writing their own dissertations instead of paying the professors to do so, as their fathers probably had. Governments sought out famous professors and offered them high salaries and research funds, and stipends for their students. The fixation on Wissenschaft placed the long-standing competition among universities on an idealistic footing.

Between 1750 and 1825, the research enterprise established itself, along with institutions that now seem eternal and indispensable: the university library, with its acquisitions budget, large building, and elaborate catalogues; the laboratory; the academic department, with its fellowships and specialized training. So did a new form of teaching: the seminar, in which students learned by doing, presenting reports on their original research for the criticism of their teachers and colleagues. The new pedagogy prized novelty and discovery; it was stimulating, optimistic, and attractive to students around the world. Some ten thousand young Americans managed to study in Germany during the nineteenth century. There, they learned that research defined the university enterprise. And that is why we still make our graduate students write dissertations and our assistant professors write books. The multicultural, global faculty of the American university still inhabits the all-male, and virtually all-Christian, research universities of Mommsen’s day. [....]

[And so on up to the present. --JW] Today, academic charisma—and the ascetic life of scholarship that goes with it—retains a central place in the life of universities. Scholars in all fields continue to gain preferment because they are “productive” (the academic euphemism for obsessive), and students continue to emulate them. Future investment bankers pull all-nighters delving into subjects that they will never need to know about again, and years later, at reunions, they recall the intensity of the experience with something close to disbelief—and, often, passionate nostalgia. The university has never been a sleek, efficient corporation. It’s more like the military, an organization at once radically modern and steeped in color and tradition. And it’s not at all easy to say how much of the mystique could be stripped away without harming the whole institution. If you thoroughly rationalize charisma, can it remain charismatic?
--------------------

While this isn't the whole story, it certainly seems to capture an important part of it. Grafton nicely sums up the central insight organizing this account.
Universities are strange and discordant places because they are palimpsests of the ancient and the modern. Their history follows a Weberian narrative of rationalization, but it also reveals the limits of that rationalization.
As it happens--and as Grafton himself implies--that model can usefully be applied to a very wide variety of institutions in the modern world, in arenas ranging from politics to war and even the capitalist enterprise. But Grafton's piece does a fine job of outlining some specific ways that this dynamic has played itself out in shaping a social world that many of us have experienced and in some cases still inhabit, that of the modern university. Along the way, it weaves in such well-known tidbits as the castration of Abelard (a side-effect of his excessive academic charisma) and Edward Gibbon's reasons for dropping out of Oxford.

You don't even need to agree with this essay to find it charming, enjoyable, intelligent, and illuminating, so read the whole thing (below).

--Jeff Weintraub
====================
New Yorker
October 23, 2006
THE NUTTY PROFESSORS

The history of academic charisma

By Anthony Grafton

Anyone who has ever taught at a college or university must have had this experience. You’re in the middle of something that you do every day: standing at a lectern in a dusty room, for example, lecturing to a roomful of teen-agers above whom hang almost visible clouds of hormones; or running a seminar, hoping to find the question that will make people talk even though it’s spring and no one has done the reading; or sitting in a department meeting as your colleagues act out their various professional identities, the Russian historians spreading gloom, the Germanists accidentally taking Poland, the Asianists grumbling about Western ignorance and lack of civility, and the Americanists expressing surprise at the idea that the world has other continents. Suddenly, you find yourself wondering, like Kingsley Amis’s Lucky Jim, how you can possibly be doing this. Why, in the age of the World Wide Web, do professors still stand at podiums and blather for fifty minutes at unruly mobs of students, their lowered baseball caps imperfectly concealing the sleep buds that rim their eyes? Why do professors and students put on polyester gowns and funny hats and march, once a year, in the uncertain glory of the late spring? Why, when most of our graduate students are going to work as teachers, do we make them spend years grinding out massive, specialized dissertations, which, when revised and published, may reach a readership that numbers in the high two figures? These activities seem both bizarre and disconnected, from one another and from modern life, and it’s no wonder that they often provoke irritation, not only in professional pundits but also in parents, potential donors, and academic administrators.

Not that long ago, universities played a very different role in the public imagination, and top academics seemed to glitter as they walked. At a Berlin banquet in 1892, Mark Twain, himself a worldwide celebrity, stared in amazement as a crowd of a thousand young students “rose and shouted and stamped and clapped, and banged the beer-mugs” when the historian Theodor Mommsen entered the room:

This was one of those immense surprises that can happen only a few times in one’s life. I was not dreaming of him; he was to me only a giant myth, a world-shadowing specter, not a reality. The surprise of it all can be only comparable to a man’s suddenly coming upon Mont Blanc, with its awful form towering into the sky, when he didn’t suspect he was in its neighborhood. I would have walked a great many miles to get a sight of him, and here he was, without trouble, or tramp, or cost of any kind. Here he was, clothed in a titanic deceptive modesty which made him look like other men. Here he was, carrying the Roman world and all the Caesars in his hospitable skull, and doing it as easily as that other luminous vault, the skull of the universe, carries the Milky Way and the constellations.

Mommsen’s fantastic energy and work ethic—he published more than fifteen hundred scholarly works—had made him a hero, not only among scholars but to the general public, a figure without real parallels today. The first three volumes of his “History of Rome,” published in the eighteen-fifties, were best-sellers for decades and won him the Nobel Prize in Literature in 1902. Berlin tram conductors pointed him out as he stood in the street, leaning against a lamppost and reading: “That is the celebrated Professor Mommsen: he loses no time.” Mommsen was as passionately engaged with the noisy, industrializing present as with the ancient past. As a liberal member of the Prussian legislature, he fought racism, nationalism, and imperialism, and clashed with Bismarck. Yet Mommsen knew how to coöperate with the government on the things that really mattered. He favored reorganizing research in the humanities along the autocratic, entrepreneurial lines of the big businesses of his time—companies like Siemens and Zeiss, whose scientific work was establishing Germany as the leading industrial power in Europe. This approach essentially gave rise to the research team, a group of scholars, headed by a distinguished figure, that receives funding to achieve a particular goal. Mommsen’s view was that “large-scale scholarship—not pursued, but directed, by a single man—is a necessary element in our cultural evolution.” He won public support for such enterprises as a vast collection, still being amassed, of the tens of thousands of inscriptions that show, more vividly than any work of literature, what Roman life was like. He also advised the Prussian government on academic appointments, and helped make the University of Berlin and the Prussian Academy of Sciences the widely envied scientific center of the West—the Harvard, you might say, of the nineteenth century.

The model that Mommsen represented was revered and imitated around the world. In the United States, the new universities founded after the Civil War—Clark, Johns Hopkins, and Chicago—set out to gain prominence as Berlin had: by becoming research institutions and competing to attract faculty stars. In 1892, the University of Chicago, then two years old, wooed the historian Hermann von Holst away from Freiburg by promising him more than five times his previous salary. New labs and libraries popped up in cities and college towns across the country—at least until the Depression and the Second World War created other priorities. The age of academic prosperity that has lasted, with interruptions, from the nineteen-eighties to the present, and that has inspired campus novels and provoked skirmishes in the culture wars, has arguably been little more than an ironic replay of that late-nineteenth-century zenith, with academic stars fighting as hard for their own preferment as Mommsen did for the young and gifted.

But what does the academic agenda of the modern research-based university have to do with the other side of college life as we know it—with fraternity pledges, the choruses of “Gaudeamus igitur,” the stone façades of Victorian Gothic buildings? The mixed inheritance of the modern university is the subject of a new book with the somewhat oxymoronic title “Academic Charisma and the Origins of the Research University,” by William Clark, a historian who has spent his academic career at both American and European universities. Clark thinks that the modern university, with its passion for research, prominent professors, and, yes, black crêpe, took shape in Germany in the eighteenth and nineteenth centuries. And he makes his case with analytic shrewdness, an exuberant love of archival anecdote, and a wry sense of humor. It’s hard to resist a writer who begins by noting, “Befitting the subject, this is an odd book.”

Clark’s story starts in the Middle Ages. The organizations that became the first Western universities, schools that sprang up in Paris and Bologna, were in part an outgrowth of ecclesiastical institutions, and their teachers asserted their authority by sitting, like bishops, in thrones—which is why we still refer to professorships as chairs—and speaking in a prescribed way, about approved texts. “The lecture, like the sermon, had a liturgical cast and aura,” Clark writes. “One must be authorized to perform the rite, and must do it in an authorized manner. Only then does the chair convey genuine charisma to the lecturer.” Clark derives his notion of charisma, loosely but clearly, from the work of Max Weber, who developed the idea that authority assumes three forms. Traditional authority, the stable possession of kings and priests, rested on custom, “piety for what actually, allegedly or presumably has always existed.” Charismatic authority, wild and disruptive, derived from “the exceptional sanctity, heroism or exemplary character of an individual person.” Rational authority, the last of the three forms to emerge, represented the rise of bureaucratic procedure, dividing responsibilities and following precise rules.

As Weber pointed out, in real organizations these different forms of authority interact and collide. In the medieval classroom, for all its emphasis on tradition-bound hierarchy and order, a contrary force came into play, one that unleashed the charisma of talented individuals: the disputation, in which a respondent affirmed the thesis under discussion and an opponent attempted to refute it. (Unlike the lecture, the disputation hasn’t survived as an institution, but its modern legacy includes the oral defenses that Ph.D. candidates make of their theses, and the format of our legal trials.) Clark calls the disputation a “theater of warfare, combat, trial and joust,” and, indeed, early proponents likened it to the contests of athletic champions in ancient Rome.

One early academic champion was the Parisian master Abelard, who cunningly used the format of the disputation to point up the apparent inconsistencies in orthodox Christian doctrine. He lined up the discordant opinions of the Fathers of the Church under the deliberately provocative title “Sic et Non” (“Yes and No”) and invited all comers to debate how the conflicts might be resolved. His triumphs in these “combats” made him, arguably, the first glamorous Parisian intellectual. A female disciple, Héloïse, wrote to him, “Every wife, every young girl desired you in absence and was on fire in your presence.” Their story has become a legend because of what followed: Héloïse, unwed, had a child by Abelard, her kin castrated him in revenge, and they both lived out their lives, for the most part, in cloisters. But even after Abelard’s writings were condemned and burned, pupils came from across Europe hoping to study with him. He had the enduring magnetism of the hotshot who can outargue anyone in the room.

Traditionalist plodders and charismatic firebrands shared the university from the beginning. The heart of Clark’s story, however, takes place not during the Middle Ages but from the Renaissance through the Enlightenment, and not in France but in the German lands of the Holy Roman Empire. This complex assembly of tiny territorial states and half-timbered towns had no capital to rival Paris, but the little clockwork polities transformed the university through the simple mechanism of competition. German officials understood that a university could make a profit by attaining international stature. Every well-off native who stayed home to study and every foreign noble who came from abroad with his tutor—as Shakespeare’s Hamlet left Denmark to study in Saxon Wittenberg—meant more income. And the way to attract customers was to modernize and rationalize what professors and students did.

These German polities called themselves “police states”—not in the sense of being oppressive but, as Clark explains, in the sense that they tried “to achieve the good policing, die gute Policey, of the land by monitoring and regulating the behavior of subjects by paperwork.” At first, what Policey meant for the universities was just finding out what the professors were up to. Bureaucrats pressured universities to print catalogues of the courses they offered—the early modern ancestor of the bright brochures that spill from the crammed mailboxes of families with teen-age children. Gradually, the bureaucrats devised ways to insure that the academics were fulfilling their obligations. In Vienna, Clark notes, “a 1556 decree provided for paying two individuals to keep daily notes on lecturers and professors”; in Marburg, from 1564 on, the university beadle kept a list of skipped lectures and gave it, quarterly, to the rector, who imposed fines. Others demanded that professors fill in Professorenzetteln, slips of paper that gave a record of their teaching activities. Professorial responses to such bureaucratic intrusions seem to have varied as much then as they do now. Clark reproduces two Professorenzetteln from 1607 side by side. Michael Mästlin, an astronomer and mathematician who taught Kepler and was an early adopter of the Copernican view of the universe, gives an energetic full-page outline of his teaching. Meanwhile, Andreas Osiander, a theologian whose grandfather had been an important ally of Luther, writes one scornful sentence: “In explicating Luke I have reached chapter nine.”

Bureaucracy has its own logic, and officials pushed for results that looked rational: results that they could codify, sort, and explain to their masters. Glacially, the universities responded. The old disputations were discontinued. These had always placed greater emphasis on formal skill in argument than on truth of outcome, and during the Baroque period and the Enlightenment they came to seem sterile and farcical. (Rather like department meetings and creative-writing workshops today, they had begun to inspire biting satires.) Instead, the universities instituted formal examinations—exercises that were carefully graded and recorded by those who administered them. Doctoral candidates had to defend printed dissertations. Clark wonderfully describes these strenuous, scary exercises. When Dorothea Schlözer, the daughter of a professor, underwent her examination for a doctorate at Göttingen in 1787, she confronted a committee of seven examiners. In deference to her sex, she was seated not at the far end of the table, facing the professors, but between two of them. The examination—which was interrupted for tea—allowed for masterly displays of professorial snideness. One professor “pulled a rock out of his pocket and asked her to classify it. After a couple more questions, he said he was going to ask her one on the binomial theorem, but, as he reckoned most of his own colleagues knew nothing of it, he decided to skip it.” The student calmly outperformed her masters. When another professor asked about art history, she noted that she had not listed this topic on her résumé, and thus should not be asked about it—but then she answered anyway. After about two hours, a professor who had been silent until then interrupted a colleague to note that “it was 7:30 and time to quit.” Schlözer passed.

In an even more radical break with the past, professors began to be appointed on the basis of merit. In many universities, it had been routine for sons to succeed their fathers in chairs, and bright male students might hope to gain access to the privileged university caste by marrying a professor’s daughter. By the middle of the eighteenth century, however, reformers in Hanover and elsewhere tried to select and promote professors according to the quality of their published work, and an accepted hierarchy of positions emerged. The bureaucrats were upset when a gifted scholar like Immanuel Kant ignored this hierarchy and refused to leave the city of his choice to accept a desirable chair elsewhere. Around the turn of the nineteenth century, the pace of transformation reached a climax.

In these years, intellectuals inside and outside the university developed a new myth, one that Clark classes as Romantic. They argued that Wissenschaft—systematic, original research unencumbered by superstition or the authority of mere tradition—was the key to all academic achievement. If a university wanted to attract foreign students, it must appoint professors who could engage in such scholarship. At a great university like Göttingen or Berlin, students, too, would do original research, writing their own dissertations instead of paying the professors to do so, as their fathers probably had. Governments sought out famous professors and offered them high salaries and research funds, and stipends for their students. The fixation on Wissenschaft placed the long-standing competition among universities on an idealistic footing.

Between 1750 and 1825, the research enterprise established itself, along with institutions that now seem eternal and indispensable: the university library, with its acquisitions budget, large building, and elaborate catalogues; the laboratory; the academic department, with its fellowships and specialized training. So did a new form of teaching: the seminar, in which students learned by doing, presenting reports on their original research for the criticism of their teachers and colleagues. The new pedagogy prized novelty and discovery; it was stimulating, optimistic, and attractive to students around the world. Some ten thousand young Americans managed to study in Germany during the nineteenth century. There, they learned that research defined the university enterprise. And that is why we still make our graduate students write dissertations and our assistant professors write books. The multicultural, global faculty of the American university still inhabits the all-male, and virtually all-Christian, research universities of Mommsen’s day.

Clark leads the reader through these transformations, year by year and document by document. He also uses the ancient universities of Oxford and Cambridge as a traditionalist foil to the innovations of Germany. Well into the nineteenth century, these were the only two universities in England, and dons—who were not allowed to marry—lived side by side with undergraduates, in an environment that had about it more of the monastery than of modernity. The tutorial method, too, had changed little, and colleges were concerned less with producing great scholars than with cultivating a serviceable crop of civil servants, barristers, and clergymen. The eighteenth century, which saw the flowering of modern German academe, marked a nadir recorded by Edward Gibbon, the Magdalen College dropout who became the greatest historian of imperial Rome, in memorable (and slightly exaggerated) terms:

The fellows or monks of my time were decent easy men, who supinely enjoyed the gifts of the founder. Their days were filled by a series of uniform employments; the chapel and the hall, the coffee-house and the common room, till they retired, weary and well-satisfied, to a long slumber. From the toil of reading or thinking or writing they had absolved their conscience, and the first shoots of learning and ingenuity withered on the ground.

Yet, even at Oxford, some scientists and scholars offered innovative lecture courses, and, conversely, the innovative German universities did not abandon all the old ways of doing things. Professors continued to give lectures as well as to hold seminars. Academic ceremonies continued to take place, and continued to do a great deal for the reputations of universities—especially once the giving of honorary degrees began to attract the attention of newspapers. Invented traditions, moreover, proved as attractive as ancient ones—particularly at universities that drew young men of high birth and others with social pretensions. Nineteenth-century German students were even more dedicated to duelling with sabres and attending formal banquets (such as the one at which Twain saw Mommsen) than they were to original research. Twain himself was as charmed by the picturesque duelling corps and taverns of Heidelberg as he was by the avatars of modern Wissenschaft in Berlin.

Similarly, although the hiring of professors became more meritocratic, administrators faced the enduring problem of how to assess merit systematically. Clark demonstrates this by inviting us to accompany Friedrich Gedike, a Prussian minister, on the visits he made to fourteen universities in June and July, 1789, just as the French Revolution was breaking out. Where his sixteenth- and seventeenth-century predecessors would have asked about the character and teaching abilities of local professors—did they have an audience, were they punctual, were they too friendly with the students?—Gedike undertook a ruthless talent search in an academic world where states competed for researchers. At the University of Göttingen, for instance, a hub of innovation only half a century old, he found an interesting anomaly. Professors tended to remain frozen at their acquisition salaries unless they could extract more money with the leverage of an outside offer. And, because universities mostly wanted to hire professors whose greatest works were still ahead of them, junior professors were often paid more than senior ones. Hence, academics at Göttingen found the whole subject of salaries too embarrassing to discuss, and Gedike had to collect information from “sensible and well-informed students, rather than professors.”

Gedike asked sharp, precise questions, but his judgments were, necessarily, reliant on the words of the specialists he spoke to. His report offered a long and precise evaluation of Christian Gottlob Heyne, the classicist who had done more than any other professor to make Göttingen a world-class center of learning. But often he could do little more than offer character assessments—“timid,” “hypochondriac,” “very sinister and misanthropic”—of the eccentrics who dominated the various faculties. In essence, Gedike and his colleagues gathered academic gossip and passed it on. The opinions were compiled, the decisions were made, and the jobs were handed out, not solely on the basis of rational, informed scrutiny of candidates’ merits but also on the basis of what people who might know something had to say about who was hot and who was not. These procedures are all too familiar to anyone who has taken part in academic hiring decisions today. A committee sits in a room, discussing folders full of organized gossip—and, nowadays, densely technical reports—about professors at other universities. Then it does its best to decide which of them to hire and what it will take to attract them—even though no one in the room may be competent to sum up, much less assess, the work of the candidates in question. We apply our best hermeneutics to the C.V. and letters of recommendation, discount known feuds, add points for this and that—and then, somehow, arrive at a decision.

As Clark shows, the assessment of professors is only one instance of a much larger phenomenon. Universities are strange and discordant places because they are palimpsests of the ancient and the modern. Their history follows a Weberian narrative of rationalization, but it also reveals the limits of that rationalization. Mommsen, for all his modernity, spoke and wrote elegant, lucid Latin, like the humanists of the Renaissance, and enjoyed traditional academic ceremonies. Modern universities sincerely try to find the best scholars and scientists, those who work on the cutting edge of their fields, but they are also keen to preserve the traditional aspects of their culture and like their professors to wear their gowns with an air. They hope that some undefined combination of these qualities will attract the best crop of seventeen-year-olds available.

In the end, Clark never fully anatomizes how individual academics—those strange creatures flapping about in their batlike gowns—came to possess inherent charisma, as opposed to the authority conferred on them by chairs, titles, and the other “material practices” that form the core of his study. After all, charisma is to some extent irreducible; in the classroom a scholar can inspire by sheer force of intellect and personality, an effect to which bureaucratic reports seldom do justice. But Clark is shrewd in charting one aspect of academic charisma—namely, the importance of asceticism in creating an aura of greatness. Mommsen, with his heroic self-control and self-abnegation, had many precursors. The roots of academic asceticism surely lie in the university’s monastic prehistory. Indeed, Gadi Algazi, an Israeli historian, has shown that although German scholars, unlike their English counterparts, were allowed to marry and set up households from the fifteenth century onward, they took endless pains to show that they demanded big houses only so that they could work uninterrupted and married only so that they could have orderly, well-run homes.

In the eighteenth and nineteenth centuries, professorial asceticism moved from the home to the workplace, where it took new forms, most notably that of productivity on an epic, and sometimes eccentric, scale. The new model professor wore himself out: greatness of mind and depth of learning, like beauty, could be attained only through suffering. Christian Gottlob Heyne, who integrated the visual arts into the formal study of antiquity, also ran Göttingen’s university library—one of the largest and best organized in Europe—and published reviews of some eight thousand of the books that he obtained and catalogued for the university’s collection. Heyne’s pupil Friedrich August Wolf became legendary by similar means. As a scholar, his importance rested on his 1795 “Prolegomena to Homer”—an enormously successful book, though only the first volume ever appeared and it was written in Latin—which argued that the Iliad and the Odyssey were collections of originally oral poems, assembled by the poet-scholars of Hellenistic Alexandria. But what really made him a celebrity was his combination of daring and self-denial. Wolf insisted on registering as a student not of theology but of philology, even though the few available jobs for graduates were for ministers rather than scholars. Heyne showed him his desk, piled with letters from schoolteachers “who tell me that they would be glad to be hanged, from actual destitution,” but Wolf persevered. He replaced the student’s usual pigtail with a wig, so that he would not have to go to the barber; stayed away from the taverns where students caroused and the salons where they met young women; and even stopped attending lectures, since he thought that his time could be more productively spent reading the assigned books. He infuriated his teacher by reading ahead of the class and taking out all the library books that Heyne needed to prepare his lectures. And his reward came soon: a professorship at Halle, at the age of twenty-four. 
This brilliant, bitter nonconformist paradoxically became a model for later generations of students. No wonder observers praised Mommsen’s ceaseless industry so extravagantly half a century later: he was not only doing history at a superb level but also living an ascetic ideal that still mattered.

Today, academic charisma—and the ascetic life of scholarship that goes with it—retains a central place in the life of universities. Scholars in all fields continue to gain preferment because they are “productive” (the academic euphemism for obsessive), and students continue to emulate them. Future investment bankers pull all-nighters delving into subjects that they will never need to know about again, and years later, at reunions, they recall the intensity of the experience with something close to disbelief—and, often, passionate nostalgia. The university has never been a sleek, efficient corporation. It’s more like the military, an organization at once radically modern and steeped in color and tradition. And it’s not at all easy to say how much of the mystique could be stripped away without harming the whole institution. If you thoroughly rationalize charisma, can it remain charismatic?

If Clark helps us to understand why the contemporary university seems such an odd, unstable compound of novelty and conservatism, he also leaves us with some cause for unease. Mommsen may have liked to see himself as a buccaneering capitalist, but his money came from the state. Today, by contrast, dwindling public support has forced university administrators to look for other sources of funding, and to assess professors and programs through the paradigm of the efficient market. Outside backers tend to direct their support toward disciplines that offer practical, salable results—the biological sciences, for instance, and the quantitative social sciences—and universities themselves have an incentive to channel money into work that will generate patents for them. The new regime may be a good way to get results, but it’s hard to imagine that this style of management would have found much room for a pair of eccentrics like James Watson and Francis Crick, or for the kind of long-range research that they did. As for the humanities, once the core of the enterprise—well, humanists these days bring in less grant money than Mommsen, and their salaries and working conditions reflect that all too clearly. The inefficient and paradoxical ways of doing things that, for all their peculiarity, have made American universities the envy of the world are changing rapidly. What ironic story will William Clark have to tell a generation from now?

Friday, October 27, 2006

Sexism & Islam in Australia - Bad news & good news

One of the most senior Muslim clerics in Australia, Sheik Taj Din al-Hilali, recently argued that immodestly dressed women who paraded themselves in public had only themselves to blame if they got raped. These sentiments are not so rare, but Sheik al-Hilali also formulated them using some especially appalling imagery, comparing such women to "uncovered meat" (according to this report in The Australian):
In the religious address on adultery to about 500 worshippers in Sydney last month, Sheik Hilali said: "If you take out uncovered meat and place it outside on the street, or in the garden or in the park, or in the backyard without a cover, and the cats come and eat it ... whose fault is it, the cats or the uncovered meat?

"The uncovered meat is the problem."

The sheik then said: "If she was in her room, in her home, in her hijab, no problem would have occurred."

He said women were "weapons" used by "Satan" to control men.

"It is said in the state of zina (adultery), the responsibility falls 90 per cent of the time on the woman. Why? Because she possesses the weapon of enticement (igraa)."
And in case anyone failed to get the point:
In a Ramadan sermon that has outraged Muslim women leaders, Sydney-based Sheik Taj Din al-Hilali also alluded to the infamous Sydney gang rapes, suggesting the attackers were not entirely to blame.

While not specifically referring to the rapes, brutal attacks on four women for which a group of young Lebanese men received long jail sentences, Sheik Hilali said there were women who "sway suggestively" and wore make-up and immodest dress ... "and then you get a judge without mercy (rahma) and gives you 65 years".

"But the problem, but the problem all began with who?" he asked.
One of the achievements of feminism over the past few generations has been to challenge views of rape that blame the victim this way--and to do so effectively enough, at least in western societies, that they are no longer woven so tightly into legal practice and everyday common sense, and that public statements like these are considered offensive rather than respectable. These changes in perspective are still uneven and complex and incomplete, but there has been a genuine shift in fundamental attitudes.

On the other hand, among people who agree that this is a shift in the right direction, the moral clarity of these issues often gets a lot more confused when it comes to dealing with sexist mores among minority and immigrant communities, especially when these get tangled up with resurgent forms of religious fundamentalism.

=> In this particular incident, one can find elements of both bad news and good news. The bad news is obvious enough. Sheik Taj Din al-Hilali (who also has a history of blatantly anti-semitic statements, by the way) is not just some obscure reactionary cleric, but one of the most prominent religious figures in Australian Islam. As his Wikipedia entry explains:
The Australian Federation of Islamic Councils appointed him Mufti of Australia in 1988. He presents himself as the Grand Mufti of Australia and New Zealand, but other Muslim groups dispute this title, which has also been described as honorary.
The good news is that his statements provoked widespread public outrage within the Australian Muslim community itself. There seems to have been little inclination to circle the wagons defensively and reject any criticisms as Islamophobic. Quoting again from The Australian:
Muslim community leaders were yesterday outraged and offended by Sheik Hilali's remarks, insisting the cleric was no longer worthy of his title as Australia's mufti.

Young Muslim adviser Iktimal Hage-Ali - who does not wear a hijab - said the Islamic headdress was not a "tool" worn to prevent rape and sexual harassment. "It's a symbol that readily identifies you as being Muslim, but just because you don't wear the headscarf doesn't mean that you're considered fresh meat for sale," the former member of John Howard's Muslim advisory board told The Australian. "The onus should not be on the female to not attract attention, it should be on males to learn how to control themselves."

Australia's most prominent female Muslim leader, Aziza Abdel-Halim, said the hijab did not "detract or add to a person's moral standards", while Islamic Council of Victoria spokesman Waleed Ali said it was "ignorant and naive" for anyone to believe that a hijab could stop sexual assault.

"Anyone who is foolish enough to believe that there is a relationship between rape or unwelcome sexual interference and the failure to wear a hijab, clearly has no understanding of the nature of sexual crime," he said.

Ms Hage-Ali said she was "disgusted and offended" by Sheik Hilali's comments. "I find it very offensive that a man who considers himself as a mufti, a leader of Australia's Muslims, can give comment that lacks intelligence and common sense."
The Islamic Council of Victoria has called for the Sheikh to resign, and his remarks were also publicly condemned by the Islamic Council of New South Wales (see here).
The council said the remarks were "un-Islamic, un-Australian and unacceptable".

A spokesman for the council, Mr Ali Roude, said he was "astonished" at Sheik Alhilali's comments, saying he "had failed both himself and the Muslim community".

"While we respect the rights of any Australian citizen to freedom of speech, there is a further responsibility upon our civic leaders, be they religious, political or bureaucratic, to offer appropriate guidance to the people under their care," Mr Roude said.

"The comments widely reported today do no such thing."

Sheik Alhilali had seriously misrepresented the teachings of Islam in his comments, Mr Roude said, which were offensive to both sexes.

The comments also showed a deep misunderstanding of rape and personal violence, which Mr Roude described as "crimes of power".

"As a father, brother and son myself, I take offence at the portrayal of both men and women in the alleged published comments," Mr Roude said.

"Islam requires all people, men and women alike, to dress with modesty.

"This is not to reduce the risk of sexual assault and rape, but rather to show respect for the God who created us all as equals and to show respect for ourselves as people who rise above the world of mere things and animals to stand as conscious beings in the presence of that same loving God - Allah Ta'ala."

Mr Roude said he had known Sheik Alhilali for many years and was deeply disappointed he had made the remarks, which were in no way shared or endorsed by the council.

"Any comments or actions which might lead any person, especially any Muslim, to despise or to objectify any other person are clearly contrary to the will of God," Mr Roude said.

"The comments reported today must be heard, read and understood in that context."
And in the face of these public rebukes, Sheik Taj Din al-Hilali felt compelled to apologize, sort of.

=> Looking at the bright side, these developments seem at least mildly encouraging. And the quoted remarks by Ali Roude include a point that is worth highlighting. One strategy that's sometimes proposed to deal with hate speech and other dangerous and pernicious opinions is to try to criminalize them. The temptations to take that route are understandable. But in free societies such statements can be countered more effectively by free speech, counter-arguments, and public condemnations than by legal controls. However, this requires that the people who matter be willing and able to use free speech for these purposes.

So the elements of good news here strike me as important and promising. But they also shouldn't encourage complacency or denial about the real and pervasive problems that this incident brings out.

=> For a useful overview of this affair so far, see the roundup by Marcus at Harry's Place. Below are translated excerpts from Sheikh al-Hilali's "controversial sermon."

--Jeff Weintraub

P.S. According to the Sheikh, atheists and Jews and Christians (that includes me, I guess) will all wind up in hell, "and not part-time, for eternity," since we are "the worst in God’s creation." On the other hand, there will be a lot of attractive women there, too, since "Satan sees women as half his soldiers."

Update: For some of the Sheik's further adventures, see here.
====================
SBS - World News Australia
October 27, 2006 - 12:51:08
Read Sheik Hilaly's comments


The following are extracts from Sheik Taj Din al-Hilaly's controversial sermon given last month, as independently translated by an SBS Arabic expert.

"Those atheists, people of the book (Christians and Jews), where will they end up? In Surfers Paradise? On the Gold Coast? Where will they end up? In hell and not part-time, for eternity. They are the worst in God’s creation."

"When it comes to adultery, it’s 90 percent the woman’s responsibility. Why? Because a woman owns the weapon of seduction. It’s she who takes off her clothes, shortens them, flirts, puts on make-up and powder and takes to the streets, God protect us, dallying. It’s she who shortens, raises and lowers. Then, it’s a look, a smile, a conversation, a greeting, a talk, a date, a meeting, a crime, then Long Bay jail. Then you get a judge, who has no mercy, and he gives you 65 years."

"But when it comes to this disaster, who started it? In his literature, writer al-Rafee says, if I came across a rape crime, I would discipline the man and order that the woman be jailed for life. Why would you do this, Rafee? He said because if she had not left the meat uncovered, the cat wouldn’t have snatched it."

"If you get a kilo of meat, and you don’t put it in the fridge or in the pot or in the kitchen but you leave it on a plate in the backyard, and then you have a fight with the neighbour because his cats eat the meat, you’re crazy. Isn’t this true?"

"If you take uncovered meat and put it on the street, on the pavement, in a garden, in a park, or in the backyard, without a cover and the cats eat it, then whose fault will it be, the cats, or the uncovered meat’s? The uncovered meat is the disaster. If the meat was covered the cats wouldn’t roam around it. If the meat is inside the fridge, they won’t get it."

"If the woman is in her boudoir, in her house and if she’s wearing the veil and if she shows modesty, disasters don’t happen."

"Satan sees women as half his soldiers. You’re my messenger in necessity, Satan tells women you‘re my weapon to bring down any stubborn man. There are men that I fail with. But you’re the best of my weapons."

"…The woman was behind Satan playing a role when she disobeyed God and went out all dolled up and unveiled and made of herself palatable food that rakes and perverts would race for. She was the reason behind this sin taking place."

Election 2006 - "Let's Take the House Before We Measure the Drapes" (Gerald McEntee)

Will the Democrats retake the House of Representatives in the November 7 election? Or, to put it the other way around, will Republicans lose their House majority, thus bringing an end to the one-party monopoly of power in the national government? Many signs point that way, but it is worth bearing in mind that this outcome is by no means a foregone conclusion. Following up Eric Alterman's cautionary words about election overconfidence, which I passed along in a previous post (Election 2006 - Don't stop worrying yet!), here is a useful sober-but-aggressive piece by Gerald McEntee, President of the American Federation of State, County and Municipal Employees (AFSCME). (Passed along to me by Rosalind Spigel of the Philadelphia-area branch of the Jewish Labor Committee.) As McEntee properly emphasizes, at this point it's absolutely crucial to get out the vote (and, we might add, to keep a careful eye out for possible hanky-panky with vote-counting systems across the country).

Pessimism of the intelligence, optimism of the will,
Jeff Weintraub
====================
The Huffington Post
October 25, 2006
Let's Take the House Before We Measure the Drapes
Gerald McEntee (Bio)

It's one thing to be optimistic about Democratic chances on November 7. It's quite another to convince ourselves of certain victory - tempting as it is. So with less than two weeks until Election Day, consider this: If we assume that good vibes and poll numbers will result in good turnout, we'll lose.

The signs of October overconfidence are everywhere, particularly when predicting control of the House of Representatives. In Sunday's New York Times, former Iowa Democratic State Party Chair Gordon R. Fischer said he had "moved from optimistic to giddy" about the party's prospects on November 7. Sunday's 60 Minutes practically handed Nancy Pelosi the Speaker's gavel. And behind-the-scenes jockeying for committee chairmanships has already begun.

Sure, we have plenty of reasons for optimism. In recent weeks, independent analysts have suggested that more and more Republican seats are now in play, expanding the field at a time when it is usually narrowing. The Iraq War, stagnant wages and the multiple ethics investigations have exposed the GOP as corrupt and dangerously inept. And then there's the depressing effect the Mark Foley scandal may have on the GOP base.

But those who believe these political pratfalls will translate into Democratic gains should remember that in three of the last four federal elections, the party expected to win big but lost huge. In 1998, analysts predicted that the Lewinsky scandal would net Newt and the GOP up to 30 House seats; the Democrats won five.

In 2000, exit polls showed Al Gore winning Florida and, with it, the presidency. And in 2004, many pundits confidently predicted a Kerry victory. Exit polls bore out their predictions all the way through election evening, with even Karl Rove reportedly offering a dire assessment of his boss's chances. Fast forward to midnight, and Democratic jubilation had evaporated into downright dejection.

History aside, there are more practical reasons to be wary of overconfidence. First, the GOP has gobs more money than we do. Last Friday alone, the NRCC dumped $8.5 million into close House races across the country, most of it for negative ads. And though the AFL-CIO, AFSCME and progressive organizations are committing record amounts (of troops and money), we simply cannot match Republican spending.

Second, as many in our movement have pointed out, the Republican GOTV operation has been superior to ours. If news accounts of Ken Mehlman's "Weekly Grassroots Report" are accurate, GOP volunteers broke a record this year in voter contacts, reaching more than one million voters in the past month. Need more proof? Look at the Rhode Island primary results, where the GOP managed to identify and coax just about every moderate Republican to the polls on behalf of Lincoln Chafee.

To be sure, we're catching up to the GOP on GOTV. For our part, AFSCME is deploying thousands of volunteers as part of our "Labor to Neighbor" program. Our members will walk and phone bank precincts in their own neighborhoods, rather than in other parts of the country. It is our most aggressive and targeted midterm GOTV operation ever.

Finally, Democrats must remember that wounded animals will do anything to survive - and elephants are no different. In 2002, Republicans lied about Max Cleland's patriotism and commitment to fighting terrorism. In 2004, they demonized gay Americans. This year, it's immigrants. And just last week the desperate GOP nominee for governor in Ohio, faced with a double-digit deficit in the polls, tried to link Democrat Ted Strickland to child sex predators.

I raise these issues not to crush our confidence, but to temper it. If we desire victory on November 7, we must back up our optimism with hard work. In the next two weeks, each of us must volunteer a night or two to phone bank and door knock. We must engage our friends and neighbors at church, high school football games and every other social function in our communities. And we must open our wallets. If any progressive leaders have a dime left over on November 8 that could have been used to change the climate in Washington, they should have their heads examined.

So for the final two weeks of this campaign, let's not measure the majority office drapes. Let's take the House.

Wednesday, October 25, 2006

Election 2006 - Don't stop worrying yet! (Eric Alterman)

Most polls, analysts, and pundits now agree that the Republicans are headed for trouble in the midterm elections, with a good chance of losing control of the House and even a slim chance of losing control of the Senate, if everything breaks right for the Democrats. However, as Eric Alterman regretfully points out, not quite everyone agrees with this prognosis. Essentially, the alternative scenario argues that the Republicans' money advantage will be decisive in the end, despite everything.
I don't really relish being right about this election, but Barron's did a race-by-race analysis, all 468 Congressional contests, taking into account cash on hand, as well as organization assets on the ground, and comes up with small Republican majorities in both houses as the most likely result. (That makes it me, Karl, Barron's, Jon Alter, and Mickey [Kaus] if you're keeping score.) [....] "Look at House races back to 1972 and you'll find the candidate with the most money has won about 93% of the time. And that's closer to 98% in more recent years, according to the Center for Responsive Politics." [according to Barron's ...] Well, I could be wrong. I've been wrong before. [....]
Let's hope so.

--Jeff Weintraub
====================
Eric Alterman (Altercation)
Tuesday, October 24, 2006

I don't really relish being right about this election, but Barron's did a race-by-race analysis, all 468 Congressional contests, taking into account cash on hand, as well as organization assets on the ground, and comes up with small Republican majorities in both houses as the most likely result. (That makes it me, Karl, Barron's, Jon Alter, and Mickey if you're keeping score.) Using the same methodology in the 2002 and 2004 congressional races, Barron's bucked conventional wisdom and correctly predicted GOP gains both years. "Look at House races back to 1972 and you'll find the candidate with the most money has won about 93% of the time. And that's closer to 98% in more recent years, according to the Center for Responsive Politics. ... Our method isn't quite as accurate in Senate races: The cash advantage has spelled victory about 89% of the time since 1996. The reason appears to be that with more money spent on Senate races, you need a multi-million-dollar advantage to really dominate in advertising, and that's hard to come by." Remember, money matters, not issues, not voters, not really much of anything, save money -- and, of course, the Republicans' natural structural advantages. (Well, I could be wrong. I've been wrong before. Media Matters says the methodology used by Barron's in 2002, 2004, and 2006 hasn't been consistent, here.)

Religion & republicanism in American political culture (George Will & Alexis de Tocqueville)

The conservative columnist & pundit George Will is uneven (and especially so around election time), but sometimes he is very much on-target. The piece below is one example. As Will properly notes:
Not since the medieval church baptized, as it were, Aristotle as some sort of early — very early — church father has there been an intellectual hijacking as audacious as the attempt to present America’s principal founders as devout Christians. Such an attempt is now in high gear among people who argue that the founders were kindred spirits with today’s evangelicals, and that they founded a “Christian nation.”

This irritates Brooke Allen, an author and critic who has distilled her annoyance into “Moral Minority: Our Skeptical Founding Fathers.” It is a wonderfully high-spirited and informative polemic that, as polemics often do, occasionally goes too far. Her thesis is that the six most important founders — Franklin, Washington, Adams, Jefferson, Madison and Hamilton — subscribed, in different ways, to the watery and undemanding Enlightenment faith called deism. That doctrine appealed to rationalists by being explanatory but not inciting: it made the universe intelligible without arousing dangerous zeal. [....]

Allen’s challenge is to square the six founders’ often pious public words and behavior with her conviction that their real beliefs placed all six far from Christianity. Her conviction is well documented, exuberantly argued and quite persuasive. [....]

When Franklin was given some books written to refute deism, the deists’ arguments “appeared to me much stronger than the refutations; in short, I soon became a thorough deist.” [....] What Allen calls Washington’s “famous gift of silence” was particularly employed regarding religion. But his behavior spoke. [....] He acknowledged Christianity’s “benign influence” on society, but no ministers were present and no prayers were uttered as he died a Stoic’s death. [....] Jefferson, writing as a laconic utilitarian, urged his nephew to inquire into the truthfulness of Christianity without fear of consequences: “If it ends in a belief that there is no god, you will find incitements to virtue in the comforts and pleasantness you feel in its exercise, and the love of others which it will procure you.” [....]

In 1781, the Articles of Confederation acknowledged “the Great Governor of the World,” but six years later the Constitution made no mention of God. When Hamilton was asked why, he jauntily said, “We forgot.” Ten years after the Constitutional Convention, the Senate unanimously ratified a treaty with Islamic Tripoli that declared the United States government “is not in any sense founded on the Christian religion.”
And furthermore:
Allen neglects one argument for her thesis that the United States is a “secular project”: the Constitution mandates the establishment of a political truth by guaranteeing each state the same form of government (“republican”). It does so because the founders thought the most important political truths are knowable. But because they thought religious truths are unknowable, they proscribed the establishment of religion.
But at the same time:
Two days after Jefferson wrote his letter endorsing a “wall of separation” between church and state, he attended, as he occasionally did, religious services in the House of Representatives. Jefferson was an observant yet unbelieving Anglican/Episcopalian throughout his public life. This was a statesmanlike accommodation of the public’s strong preference, which then as now was for religion to have ample space in the public square.
Like it or not, this kind of accommodation helped to set some of the enduring tendencies in American political culture. In countries whose political conflicts have been influenced, directly or indirectly, by the legacies of the French revolution, republicanism (whether in "bourgeois," nationalist, socialist, and/or Leninist forms) has often had to take the form of a counter-religion, with strong anti-clerical and militantly secularist elements. This pattern has predominated historically not only in the Catholic countries of Europe and Latin America, but also in Kemalist Turkey. In American political culture, on the other hand, the predominant pattern has involved a complementary co-existence of religion and democratic republicanism. In fact, it has been argued that, in this context,
Christianity, particularly its post-Reformation ferments, fostered attitudes and aptitudes associated with popular government.
One important thinker who argued this was, of course, Tocqueville. Tocqueville's own religious leanings have been the subject of much debate, but it has always seemed to me most plausible that in this respect his attitude was not that different from that of Adams, Madison, Jefferson, Hamilton and the other American Founding Fathers. At all events, Tocqueville's famous discussion of the significance of religion for political liberty (in Vol. I, Part II, ch. 9 of Democracy in America) does not strike the tone one might expect from an orthodox believer--but it could nevertheless be convincing to orthodox believers, and Tocqueville clearly hoped it would be.
I have just pointed out the direct action of religion on politics in the United States. Its indirect action seems to me much greater still, and it is just when it is not speaking of freedom at all that it best teaches the Americans the art of being free. [....]

Though it is very important for man as an individual that his religion should be true, that is not the case for society. Society has nothing to fear or hope from another life; what is most important for it is not that all citizens should profess the true religion but that they should profess religion.

Religion, which never intervenes directly in the government of American society, should therefore be considered as the most important of their political institutions, for although it did not give them the taste for liberty, it singularly facilitates their use thereof.
This perspective is likely to be irritating both to militant rationalists and to theocrats, among others, but at least they should face the fact that it captures the predominant understandings that have structured the relationship between religion and democratic republicanism in American political culture. Surveying the political alternatives offered by the past two centuries, there is something to be said for this accommodation.

--Jeff Weintraub
====================
New York Times (Book Review)
Sunday, October 22, 2006
God of our Fathers
By George F. Will

Not since the medieval church baptized, as it were, Aristotle as some sort of early — very early — church father has there been an intellectual hijacking as audacious as the attempt to present America’s principal founders as devout Christians. Such an attempt is now in high gear among people who argue that the founders were kindred spirits with today’s evangelicals, and that they founded a “Christian nation.”

This irritates Brooke Allen, an author and critic who has distilled her annoyance into “Moral Minority: Our Skeptical Founding Fathers.” It is a wonderfully high-spirited and informative polemic that, as polemics often do, occasionally goes too far. Her thesis is that the six most important founders — Franklin, Washington, Adams, Jefferson, Madison and Hamilton — subscribed, in different ways, to the watery and undemanding Enlightenment faith called deism. That doctrine appealed to rationalists by being explanatory but not inciting: it made the universe intelligible without arousing dangerous zeal.

Eighteenth-century deists believed there was a God but, tellingly, they frequently preferred synonyms for him — “Almighty Being” or “Divine Author” (Washington) or “a Superior Agent” (Jefferson). Having set the universe in motion like a clockmaker, Providence might reward and punish, perhaps in the hereafter, but does not intervene promiscuously in human affairs. (Washington did see “the hand of Providence” in the result of the Revolutionary War.) Deists rejected the Incarnation, hence the divinity of Jesus. “Christian deist” is an oxymoron.

Allen’s challenge is to square the six founders’ often pious public words and behavior with her conviction that their real beliefs placed all six far from Christianity. Her conviction is well documented, exuberantly argued and quite persuasive.

When Franklin was given some books written to refute deism, the deists’ arguments “appeared to me much stronger than the refutations; in short, I soon became a thorough deist.” Revelation “had indeed no weight with me.” He believed in a creator and the immortality of the soul, but considered these “the essentials of every religion.”

What Allen calls Washington’s “famous gift of silence” was particularly employed regarding religion. But his behavior spoke. He would not kneel to pray, and when his pastor rebuked him for setting a bad example by leaving services before communion, Washington mended his ways in his austere manner: he stayed away from church on communion Sundays. He acknowledged Christianity’s “benign influence” on society, but no ministers were present and no prayers were uttered as he died a Stoic’s death.

Adams declared that “phylosophy looks with an impartial Eye on all terrestrial religions,” and told a correspondent that if they had been on Mount Sinai with Moses and had been told the doctrine of the Trinity, “We might not have had courage to deny it, but We could not have believed it.” It is true that the longer he lived, the shorter grew his creed, and in the end his creed was Unitarianism.

Jefferson, writing as a laconic utilitarian, urged his nephew to inquire into the truthfulness of Christianity without fear of consequences: “If it ends in a belief that there is no god, you will find incitements to virtue in the comforts and pleasantness you feel in its exercise, and the love of others which it will procure you.”

Madison, always common-sensical, briskly explained — essentially, explained away — religion as an innate appetite: “The mind prefers at once the idea of a self-existing cause to that of an infinite series of cause & effect.” When Congress hired a chaplain, he said “it was not with my approbation.”

In 1781, the Articles of Confederation acknowledged “the Great Governor of the World,” but six years later the Constitution made no mention of God. When Hamilton was asked why, he jauntily said, “We forgot.” Ten years after the Constitutional Convention, the Senate unanimously ratified a treaty with Islamic Tripoli that declared the United States government “is not in any sense founded on the Christian religion.”

Allen neglects one argument for her thesis that the United States is a “secular project”: the Constitution mandates the establishment of a political truth by guaranteeing each state the same form of government (“republican”). It does so because the founders thought the most important political truths are knowable. But because they thought religious truths are unknowable, they proscribed the establishment of religion.

Allen succumbs to what her six heroes rightly feared — zeal — in her prosecution of today’s religious zealots. In a grating anachronism unworthy of her serious argument, she calls the founders “the very prototypes, in fact, of the East Coast intellectuals we are always being warned against by today’s religious right.” (Madison, an NPR listener? Maybe not.) When she says “Richard Nixon and George W. Bush, among other recent American statesmen,” have subscribed to the “philosophy” that there should be legal impediments to an atheist becoming president, she is simply daft. And when she says that Bible study sessions in the White House and Justice Department today are “a form of potential religious harassment that should be considered as unacceptable as the sexual variety,” she is exhibiting the sort of hostility to the free exercise of religion that has energized religious voters, to her sorrow.

Two days after Jefferson wrote his letter endorsing a “wall of separation” between church and state, he attended, as he occasionally did, religious services in the House of Representatives. Jefferson was an observant yet unbelieving Anglican/Episcopalian throughout his public life. This was a statesmanlike accommodation of the public’s strong preference, which then as now was for religion to have ample space in the public square.

Christianity, particularly its post-Reformation ferments, fostered attitudes and aptitudes associated with popular government. Protestantism’s emphasis on the individual’s direct, unmediated relationship with God, and the primacy of individual conscience and choice, subverted conventions of hierarchical societies in which deference was expected from the many toward the few. But beyond that, America’s founding owes much more to John Locke than to Jesus.

The founders created a distinctly modern regime, one respectful of pre-existing rights — natural rights, not creations of the regime. And in 1786, the year before the Constitutional Convention constructed the regime, Jefferson, in the preamble to the Virginia Statute for Religious Freedom, proclaimed that “our civil rights have no dependence on our religious opinions, any more than our opinions in physics or geometry.”

Since the founding, America’s religious enthusiasms have waxed and waned, confounding Jefferson’s prediction, made in 1822, four years before his death, that “there is not a young man now living in the United States who will not die an Unitarian.” In 1908, William Jennings Bryan, the Democrats’ presidential nominee, said his Republican opponent, William Howard Taft, was unfit because, being a Unitarian, he did not believe in the Virgin Birth. The electorate yawned and chose Taft.

A century on, when the most reliable predictor of a voter’s behavior is whether he or she regularly attends church services, it is highly unlikely that Republicans would nominate a Unitarian. In 1967, when Gov. George Romney of Michigan evinced interest in the Republican presidential nomination, his Mormonism was of little interest and hence was no impediment. Four decades later, the same may not be true if his son Mitt, also a Mormon, seeks the Republican nomination in 2008.

In 1953, the year before “under God” was added to the Pledge of Allegiance, President Dwight D. Eisenhower declared July 4 a day of “penance and prayer.” That day he fished in the morning, golfed in the afternoon and played bridge in the evening. Allen and others who fret about a possibly theocratic future can take comfort from the fact that America’s public piety is more frequently avowed than constraining.

George F. Will is a syndicated columnist.