Tuesday, March 30, 2010

Six books

About once a term I make a point of haranguing my students about the books they should have under their belt if they are serious about being English students. This lecture (rant) is one of several that tend to surface in my normal rotation, the others being (1) You all should read more history; (2) You all should read more, period; and (3) What the hell do you mean none of you have seen Casablanca? Seriously? (for this last category, substitute any classic film I mention in passing in class that elicits blank looks of incomprehension).

Unlike these three curmudgeonly rants however, the books-you-should-read-to-major-in-English riff is one I do in the hopes that it might actually sink in with some people. Six books, I say—there are six books you should read if you want to understand the vast majority of the allusions in English literature written before 1950. They are as follows:

Homer, The Iliad and The Odyssey
Virgil, The Aeneid
Ovid, Metamorphoses
Dante, The Divine Comedy
The Bible

I sometimes get the sense that some of my senior colleagues see me as a theory-obsessed postmodernist—which, to a certain extent, is true enough; I'm also someone whose own research bleeds over into media, film and television, and popular culture. But when it comes down to it, I'm also a pretty hardcore traditionalist when it comes to teaching the canon and what I think students in English should be given as background. I'd quite cheerfully have a course teaching these six books as a requirement for our English degree if it was feasible to do so (I'd quite cheerfully teach that course too, when it comes down to it).

At any rate, I often feel as though my exhortations are falling on deaf ears, so it is quite gratifying to hear from a student—as I did this morning—that my suggestion was taken seriously. Chatting with a student as I walked from class to my office, I learned that he was knee-deep in Dante's Inferno—and enjoying it! Wonders never cease.

Monday, March 29, 2010

Big-screen versus small-screen terror

I'm currently reading a book to review called Firestorm: American Film in the Age of Terrorism. It's pretty good—a decent balance of exposition and analysis, and makes some very useful observations about the often serendipitous relationship between Hollywood and 9/11. One observation it makes, which sort of struck me, is that since the attacks nine years ago, there has been very little in the way of terror-based plots emerging from Hollywood. Certainly when compared to the fairly constant presence of terrorists as ubiquitous bad guys through the 80s and 90s, the absence is rather striking. The author quotes an article written in The Atlantic in 2008 by conservative columnist Ross Douthat, who claims "Even in films that aren't taking thinly veiled jabs at the Bush administration, terrorist baddies turn out to be Eurotrash arms dealers (2006's Casino Royale), disgruntled hackers (2007's Live Free or Die Hard), a sinister air marshal (2005's Flightplan), or the handsome white guy sitting next to you in the airport lounge (2005's Red Eye). Anyone and anybody, in other words, except the sort of people who actually attacked the United States on 9/11." I think Douthat's claim here is inane,1 for reasons I will get to in a moment, but at least has the merit of being correct in the broad strokes: namely, that there have been very few films since 9/11 that have depicted Islamist extremists in any capacity. Those that have I can more or less count on one hand: The Kingdom, United 93, Munich and Syriana are really the only ones that leap to mind (though I suppose you could include You Don't Mess With the Zohan if you wanted to stretch the designation to the breaking point, and Iron Man features bad guys in Afghanistan whom we are meant to understand as either Al-Qaeda or Taliban types).

To clarify what I mean here: there are a host of films that feature terrorism as a narrative element, or which are obviously about 9/11 in displaced form (think of Spielberg's remake of War of the Worlds), but these are not about radical Islamism per se; I also leave off the list films about the Iraq war such as The Hurt Locker, Stop-Loss or In the Valley of Elah. I find Douthat's claim insipid because, of the films he cites, only Live Free or Die Hard can reasonably be called a film about terrorism, and even there the antagonist's principal motivation is to demonstrate to his former employer, Homeland Security, how vulnerable the U.S. is to electronic assault. The other three are, respectively, an overwrought spy film, a really rather incoherent psychological thriller, and a more nuanced and taut psychological thriller. Given that the latter two involve airplanes, I can see the impulse to classify them as Douthat does, but that stretches the definition of "terrorism" to an extent that renders it more or less meaningless.

I would counter Douthat's claim, which attempts to posit this lack of "the sort of people who actually attacked the United States on 9/11" in films as Hollywood being all wimpily PC, with the observation that there have been very few films about terrorism in general since 9/11.2 It's not that Hollywood is reluctant to depict Islamists (Allah knows they were among the most frequent go-to villains in action films in the 80s and 90s), but that it has become leery of the very subject of terrorism. This is, in and of itself, unsurprising. For all it gets vilified as a bastion of liberalism, Hollywood is actually very conservative—not politically, perhaps, but it is very cautious where the financial bottom line is concerned. Whether or not big-budget genre movies—and this, really, is what we're principally talking about, not small-scale, independent or self-consciously "quality" films—could sustain audiences if they actively reminded them of the war on terror is an interesting question, but not one that major studios are likely to test at the risk of their profit margins.

All of this, besides offering me an excuse to take a few thwacks at Ross Douthat, is by way of pointing out that while films have been reluctant to depict terrorism since 9/11, Islamist or otherwise, the same cannot be said for television. Indeed, the small screen has been very nearly glutted with narratives of Samuel Huntington's so-called "clash of civilizations," and pretty much every American actor who can pass for "Eastern" has been getting work on 24. Even Kal Penn of Harold and Kumar and House fame has taken a turn as a terrorist (albeit a reluctant one). Kal Penn, I ask you! Also that season, Canada's own Shaun Majumder played a terrorist as well, casting that is perhaps even more amusing than Penn's.

While 24 is the primary example of TerrorTV™, it is by no means the only one: besides such short-lived shows as E-Ring, Threat Matrix, and The Grid, N.C.I.S. has frequent recourse to plotlines involving Islamists; season three of The West Wing began with a special post-9/11 episode ("Isaac and Ishmael") in which the characters mused over the nature of the war on terror while in a lockdown, and the second half of the season hinged to a large extent on a plot in which the president must decide whether to assassinate a foreign dignitary who is discovered to be the leader of an Al-Qaeda-type group; Showtime produced a two-season series titled Sleeper Cell, about an African-American Muslim working for the FBI who infiltrates, well, a sleeper cell; and the various Law & Order franchises have all featured episodes about terrorism, Islamism, or the new American security state.

I raise this question here because I am curious as to why this is the case—I get why film shies away, but what is it about terrorism, Islamist or otherwise, that television finds so amenable? Is it the small-screen format? Where film is leery of recreating the kind of spectacle seen on 9/11, is television unconcerned because it can't do large-scale visuals in the same way? Are terrorism plots (in the various senses of the word) better suited to the episodic format or procedural dramas? Or is there another reason?

I throw this out to general discussion—I am interested to get feedback. Also, I am sure I am missing terrorism-based films in my short list. If you want to point out anything I have overlooked, that would be super.

————————————————————————

1I suppose I should be kinder to Ross Douthat, given that he is one of a rapidly shrinking cohort of conservative writers and thinkers who actually value intellect and erudition, and overall the article of his I'm quoting offers an interesting reading of post-9/11 film as a self-conscious retread of 1970s Hollywood. His discussions are almost invariably marred however by such tendentious claims as the one he makes in the quoted passage. He pretty much always manages to seriously irritate me whenever I read his columns, less by his politics than the presumptions about leftist and liberal thought that underwrite his arguments. (One way or another, I am deeply amused that Microsoft Word's spellcheck suggests that the word I'm meaning to type when I write "Douthat" is actually "douched.")

2Douthat, incidentally, does not do much to help his argument when he dismisses one of the few post-9/11 films to specifically depict Islamist terrorism—Syriana—as a film that "eschews nuance entirely, tracing all the ills of Mesopotamia to a malign nexus of Texas oilmen, neocons, and a trigger-happy CIA." Omitted from this list, interestingly, are such Middle Eastern regimes as Saudi Arabia, whose systemic maltreatment of migrant workers provides the primary site and source of religious extremism in the film.

Wednesday, March 24, 2010

Simply embarrassing, continued ...

For a while there, Ann Coulter was off the radar. Considering what a fixture she had been on Fox News and elsewhere as a Bush/Cheney cheerleader, it was odd that once Obama was elected she sort of faded into the background. My pet theory is that the rise of Glenn Beck's particular brand of batshit insanity stole her thunder: beside Beck's running-off-madly-in-all-directions paranoia, his weepy exhortations, and his trademark chalkboard, Coulter's schtick loses much of its edge and becomes merely an increasingly familiar set of snarky snipes at liberals.

No wonder she's embarked on a Canadian tour.

I love my country, but damn can we be a thin-skinned bunch. As if François Houle's email wasn't enough to stoke the Coulter furnace of righteous indignation, her appearance at the University of Ottawa last night had to be cancelled because of "boisterous demonstrations outside that sponsors of the appearance feared could turn violent."

This is appalling on several levels. The first, as I argued in yesterday's post, is that it plays perfectly into Coulter's hands: it allows her to portray herself as a martyr to free speech, and to paint liberals and leftists as thugs and hypocrites who only value freedom of expression if it agrees with their own views.

Second, it makes for the galling truth that—on this point, at least—she is correct. After watching the drama of the health care debate in the U.S. last summer, in which Tea Partiers disrupted town hall meetings by drowning out any possibility for actual discussion, sought to intimidate their opponents by coming armed, and vandalized Democratic district offices, numerous voices on the left rightfully decried such thuggish tactics. Canadians, who are supposed to be the more moderate and reasonable nation, just helped their political kin south of the border cede some of their moral high ground.

Third, the principle of free speech is easy if those speaking are only ever those you agree with. The true test comes when you're obliged to give time to those who infuriate you. Protesters at U of Ottawa? FAILED.

Finally, it is deeply ironic that this all happened more or less at the same time as Republican obstructionism, fear-mongering, mendacity, and extremist rhetoric lost a huge battle as health care reform passed the House of Representatives. Since Obama's election, the American right embraced a strategy of demonizing the new administration and went all in against health care reform legislation on the assumption that killing it would mortally cripple the president and his agenda. Emphatically in their corner were (and are) the Coulter coterie—Glenn Beck, Rush Limbaugh, Mark Levin, Sean Hannity, Sarah Palin, Karl Rove, Bill O'Reilly, and all the rest of the crew at Fox News. Arguably, the bullhorns wielded by these bloviators are what has kept the GOP so resolutely hewing to their all-or-nothing strategy—any time any Republican has attempted to distance him or herself from the Obama-is-antichrist (or Hitler, or Stalin, or grandma-killer) line or critique the extremist view, they have been firmly whipped back into place. The threat of Tea Party primary challenges alone has been a serious check on any Republican bipartisan tendencies.

The point is that they lost, and lost big. In what have quickly become his most-quoted words since he (apparently) penned the phrase "Axis of Evil" as a Bush speechwriter, David Frum laments that the GOP has allowed itself to become so influenced by the Rush Limbaugh crowd:

We followed the most radical voices in the party and the movement, and they led us to abject and irreversible defeat.

There were leaders who knew better, who would have liked to deal. But they were trapped. Conservative talkers on Fox and talk radio had whipped the Republican voting base into such a frenzy that deal-making was rendered impossible. How do you negotiate with somebody who wants to murder your grandmother? Or – more exactly – with somebody whom your voters have been persuaded to believe wants to murder their grandmother? ... Talk radio thrives on confrontation and recrimination. When Rush Limbaugh said that he wanted President Obama to fail, he was intelligently explaining his own interests. What he omitted to say – but what is equally true – is that he also wants Republicans to fail. If Republicans succeed – if they govern successfully in office and negotiate attractive compromises out of office – Rush's listeners get less angry. And if they are less angry, they listen to the radio less, and hear fewer ads for Sleepnumber beds.

All this is by way of saying that over the next few months, as the moderate (and indeed Republican) nature of the new health care bill becomes apparent, all the Coulter-style dishonesty infecting conservative discourse about the Obama presidency will be thrown into sharper relief.

Again, I reiterate my point from my last post: Coulter's speech may be hateful, but it isn't hate speech. And even if it were, censoring or shutting it down only gives it credence. Practically nothing she says stands up to even halfhearted analysis and critique, and her principal rhetorical strategy is to misdirect by attack and insult. She's best ignored, and if she's invited to give a talk at our universities, don't give her the satisfaction of letting her know you care.

Tuesday, March 23, 2010

Simply embarrassing

Everyone's favourite conservative agent provocateur, Ann Coulter, is doing a Canadian tour this week, and yesterday made her first stop at my alma mater UWO. Kristen told me that the lineup for her talk was out the door and down the street, which makes me wonder: are there that many conservatives at Western? are people just excited at the chance to see an American celebrity? or are those numbers swelled by media studies students attending so they can shout her down? (I'm guessing all of the above).

What's "embarrassing" however is not that my former school and others has invited Coulter, but that Francis Houle, VP Academic and Provost of U of Ottawa felt compelled to send her an email warning her that hate speech in Canada is defined differently than in the U.S.:

I would, however, like to inform you, or perhaps remind you, that our domestic laws, both provincial and federal, delineate freedom of expression (or "free speech") in a manner that is somewhat different than the approach taken in the United States. I therefore encourage you to educate yourself, if need be, as to what is acceptable in Canada and to do so before your planned visit here.

You will realize that Canadian law puts reasonable limits on the freedom of expression. For example, promoting hatred against any identifiable group would not only be considered inappropriate, but could in fact lead to criminal charges. Outside of the criminal realm, Canadian defamation laws also limit freedom of expression and may differ somewhat from those to which you are accustomed. I therefore ask you, while you are a guest on our campus, to weigh your words with respect and civility in mind.

This is cringe-inducing, painful, crawl-under-a-rock embarrassing—not least because it plays so perfectly into Coulter's hands that I'm sure she did a little dance of joy when she received it. And, entirely predictably, she ended her talk at Western by saying "I will be filing a complaint with the human rights commission" over the e-mail from U of Ottawa, and said she was "a victim of hate crime ... Either what [Houle] did to me is a hate crime or that whole commission is B.S."

The whole point of Coulter's schtick is to push people's buttons, and ideally to get them to respond as Francis Houle did; the only proper response is to ignore her. Threatening her with prosecution under hate speech laws—besides being inane—merely throws fuel on her fire and encourages her to ramp things up. At the Western talk, she suggested a Muslim student "take a camel"; she said that "In America everybody wants to be black. The feminists want to be black, the illegal aliens want to be black, the gays want to be black"; "There are only two things gay men can't do. Number one, get married to each other. Number two, throw a baseball without looking like a girl."

Obnoxious? Yes. Hateful? Yes. Hate speech? Hardly. But she would love nothing more than to have someone bring charges against her.

Monday, March 22, 2010

What a brilliant idea

One of the great benefits of being an English professor is that I get carte blanche to be as much of a luddite as I want. There is very little about this job that forces technology on you. I use it, of course—I love my laptop and all of the benefits of online databases for research, to say nothing of doing library searches from my office so I know whether I even need to walk the few hundred feet to enter the stacks. And I find email the easiest way to communicate with students outside of class and office hours.

On the other hand, I resisted getting a cell phone until the first time I drove from St. John's to Ontario, largely as a contingency against breaking down on a desolate stretch of the TCH. I got the cheapest phone with the most minimal plan however, for the simple reason that I never use it. In fact, its principal function is to serve as my timepiece, as my watch was stolen out of the gym change room a year ago, and I haven't bothered to buy a new one.

This sort of aberrant behaviour puts me firmly in the minority these days, I know—and I am never more aware of that fact than on the numerous times a week (sometimes a day) when someone to whom I'm speaking suddenly retrieves their phone from their pocket to read an incoming text.

While I have adopted Facebook and instant messaging on my computer, texting and tweeting remain beyond me. I suspect that this might be one of those things that would change entirely were I to get a Blackberry or iPhone, but for the moment I remain a cheerful and resolute non-texter. And like all selective luddites, I tend toward high levels of irritation when someone interrupts a conversation with me to text someone or, even worse, answer their phone and proceed to have a lengthy conversation. But then, I've always been someone who gets similarly irritated when I am over at someone's house and they do the same thing with their home phone. When I have people over, I either get off the phone quickly or let it go to message.

Seriously. I've never understood why phone conversations trump actual people's presence.

A friend of mine conducted an experiment one Christmas. He was buying someone some perfume as a gift, and the lineup at the cash was huge. So he called the store, got put through to the fragrances department, and asked to purchase the perfume in question over the phone with his credit card. The cashier who answered him interrupted her dealings with her current customer, and rang in his sale. My friend then went and did some more shopping, and fifteen minutes later walked up to the cash, ignoring the line, and said "Hi—I called about the perfume?" He got his package and walked out of the store, noting that the person who was at the end of the line when he made the call still had not been served.

Now, to be fair, I don't know how the cashier might have acted differently, though if it had been me I think I would have at least put my friend on hold until I could finish with the person I was dealing with.

At any rate, my luddite hackles get raised most frequently these days when I see students texting in class. Some are able to be discreet, but most don't really seem to care one way or another—and it is common enough that calling every student out for doing it becomes a little disruptive, and I start to feel shrill.

Which is why this article in Slate magazine strikes me as an excellent compromise. One of the problems with snowballing technology is that social conventions and etiquette take time to catch up. What is proper etiquette for texting? The Slate article seems to me to be eminently reasonable in suggesting that "If you're in a situation where you'd excuse yourself to go to the bathroom, you should also excuse yourself before reaching for your phone. Otherwise, go ahead without asking. Either way, don't play with your phone longer than you'd stay in the bathroom." Similarly, if you find yourself in a movie theatre or a classroom and you absolutely need to send or receive a text, leave the room. I have what I think is a pretty reasonable standard for classroom behaviour, which is that I consider my students adults (which, of course, they are); you can come and go from my classroom as you please if you need to use the washroom. In the same vein, if students are whispering to each other, I encourage them to go find somewhere in the hallway to have their conversation; if they are yawning hugely or falling asleep, I suggest that they might go buy a coffee. In the end, my students are paying for the privilege of being there, so I don't begrudge them the freedom to walk out. I do however get upset if their behaviour potentially distracts or bothers other tuition-paying students … and from what I've gleaned from a surprising number of students, in-class texting pisses off more people in the classroom than just me.

Thursday, March 18, 2010

Comments for this post have been disabled (no, not really)

I sometimes think fondly of that naïve time back when the InterWebs were the New Big Thing. Remember that innocent age? Bill Clinton was a loyal and faithful husband still; Jean Chretien was set to be Prime Minister For Life; some cheerful lads from Seattle taught us about the aesthetic pleasures of secondhand flannel shirts and crushing nihilism; some people predicted that the digital economy would result in the end of work; it seemed as though any basement-dwelling slacker with a smattering of HTML could pilot a dot com startup and make millions; the words "tech" and "bubble" were never contiguous; and, most gloriously, the newly-connected world of the World Wide Web (a term people still used unironically in the same breath as "information superhighway") was going to lead us to a golden age of democratized information and knowledge that would free us from the constraints of our bodies and allow us to commune with the minds of others in the Digital Utopia.

Remember that time? Really, we should have paid more attention to Mike Godwin.

The Mike Godwin, that is, who formulated Godwin's Law. This brilliant insight, originally made in 1990, is as follows: "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1." Though I didn't know it at the time, this was the response I wanted to have for the various Digital New Agers and New Gnostics who saw the Internet as the Promised Land. It seemed too good to be true: whatever democratic vistas were offered by the wired world, it was inevitable that human nature would assert itself.

Godwin saw it. What is it about comment threads that brings out the crazies? A brief perusal of YouTube comment threads is often enough to smother my faith in humanity with a pillow. I used to read the comics online, but anything more political than Ziggy would inspire ranting, ad hominem attacks, and of course the requisite comparison of one's opponent to Hitler in the comments section (a comments section for a comics web site? Why?). When I found I was reading the comments more than the comics (and therefore wasting more time than the thirty seconds it took to read Doonesbury, Non Sequitur, and Bloom County reruns), I stopped altogether.

Comment threads are however hard to avoid, and can suck you in before you even realize you've just spent twenty minutes reading the argument between DemonGirl23 and jerseyboy about Obama's birth certificate. And what's appalling is that, more often than not, what sucks you in is not the intelligence and insightfulness of the debate, but its extreme and toxic rhetoric. In other words, that twenty minutes you just wasted is not mitigated by having learned something; it is genuinely wasted time in which any optimism you might have had about human beings' basically decent nature has been thoroughly dashed.

What did I read recently to inspire this post, you ask? The conservative blogger and pundit Michael Medved wrote a post yesterday on TownHall.com that calmly and carefully explained why conservatives need to back away from the conspiracy theories that the Obama Administration is deliberately trying to destroy the U.S. economy so it can impose a communist regime. Medved, it should be noted, is as right-wing as they come; his post is not a defense of the President, but rather a rebuke urging the hysterical right to stop being delusional. Some figures on the right, notably David Frum, have been making similar arguments, and have largely been given grief for their attempts to be reasonable (and when Medved and Frum are the voices of reason on the right, you know we've gone down the rabbit hole).

Rather predictably, the comment thread for Medved's post was pretty virulent. I saw exactly one response that could be construed as agreeing with him. If you have the stomach for it, you should check it out, but here are some highlights:

"You ... cannot come to grips with the reality of how radical Obama is. All one must do is listen to what he says. He wants to remake this country in his image and to do that he must first destroy the old country we have known for 200+ years."

"Medved is a hack and an Obama supporter, in disguise."

"[Y]ou are deluding yourself if you think Obama is pure-of-heart and that his differences with the majority of Americans can be attributed simply to ideology. He is a radical departure from past presidents, both in his intentions and methods.

"Medved has been corrupted by his Hollywood friends. There is no logic in his argument, derived from Sean Penn like nuttiness."

"Communism got routed in the Cold War, but the Reds in the west, and there are many, didn't just crawl under a rock and die. They came up with bulldung like global warming so that they could impose things like cap and trade."

"[Obama] like every extreme socialist liberal, hates America as our founders intended to be; he thinks his failed socialistic ideas are superior, and he is doing everything he can to implement them."

"The most significant, plain, simple truth here in answer to Medved's naïve question is Obama is at best of Muslim heritage. There cannot be a sane person alive who will argue that point ... Obama is a Muslim – period."

"Whats it going to take to convince you, Medved that Obama is at least a Socialist and probably a Marxist? A hammer and sickle waving over the White House? Just remember this; the German people never thought Hitler and his cronies would do the things they did."

Aaaaaaaaaaaaaaand ... there it is. It was when I got to the last one here that I snapped out of my reverie and stopped reading. Good times.

(All that being said, I do encourage comments).

Monday, March 15, 2010

Jefferson v. Board of Education


In the last few days, all the left-leaning political blogs I read have had stories about how the Texas Board of Education has recently made changes to the Social Studies curriculum to make it conform to a social and religious conservative perspective. This is not new: since the beginning of the year, the board has made over one hundred changes in this respect. Ten of the fifteen people on the board comprise a solid conservative bloc that has introduced and ratified such changes as dropping the suggestion that Japanese internment in WWII was motivated in part by racism; teaching that the constitution does not enact a separation of church and state; introducing a defense of McCarthyism because "there were some communists who were discovered"; teaching that sexual identity, eating disorders and rape are a matter of "choice"; and requiring that students learn about "positive" things like "Phyllis Schlafly, the Contract With America, the Heritage Foundation, the Moral Majority and the National Rifle Association."

One of the changes that went through most recently is at once the most puzzling and most revealing. The headline at Think Progress reads "Texas Board of Education Cuts Thomas Jefferson out of its Textbooks." Upon further investigation, what was "cut" was a reference to Jefferson in the context of Enlightenment thought. The original standard in the social studies curriculum was to "explain the impact of Enlightenment ideas from John Locke, Thomas Hobbes, Voltaire, Charles de Montesquieu, Jean Jacques Rousseau, and Thomas Jefferson on political revolutions from 1750 to the present." The amended version now directs teachers to "explain the impact of the writings of John Locke, Thomas Hobbes, Voltaire, Charles de Montesquieu, Jean Jacques Rousseau, Thomas Aquinas, John Calvin and Sir William Blackstone."

So while I have to assume that Think Progress is being a bit misleading and that Thomas Jefferson still resides in social studies textbooks in some form, this change is nevertheless interesting and wants parsing. At first glance, the amended version seems more or less the same as the original, minus Jefferson and plus Calvin, Aquinas and Blackstone. What is significant, however, is what else is elided in the revision: namely, the references to the Enlightenment and political revolution.

That conservatives would seek to throw Jefferson out of the boat is odd, to say the least. The Founding Fathers of the U.S. constitution, especially the resonant names of Washington, Adams, Jefferson, Franklin and Madison, tend reliably to make their way into American political argument on both sides of the ideological coin to lend rhetorical weight; liberals and conservatives and everyone in between remake the Founders in their own image again and again for the simple reason that they stand as a key touchstone of American political discourse signifying all that is positive and exceptional about the U.S.

Hence, it is rather puzzling that the conservative faction on the Texas Board of Education would elide Jefferson rather than refashion him in a way that lets them claim him as one of their own. A New York Times article about these changes suggests that Jefferson was not popular among the conservatives on the board as he was the one responsible for the phrase "separation of church and state"—and indeed, much energy seems to have been specifically devoted to the erasure of that idea. One board member declared, "I reject the notion by the left of a constitutional separation of church and state," and a subsequent amendment introduced to explore how "the founding fathers protected religious freedom in America by barring the government from promoting or disfavoring any particular religion above all others" was firmly voted down.

However, the rejection of Jefferson in favour of Aquinas et al has a more pernicious element than the church and state falsehood being peddled. The facile reason offered for this change is that Jefferson was himself merely influenced by the other philosophers listed, as if he offered no substance of his own but merely parroted his predecessors. Someone with even a casual familiarity with late-eighteenth century history and philosophy however will know that Jefferson, Thomas Paine, and the American Revolution more generally had a seismic impact on European politics—Jefferson was not merely a conduit for other thinkers, but was a brilliant philosopher in his own right whose words and ideas resonated through the storming of the Bastille and the tumultuous revolutionary spirit of nineteenth-century Europe. More importantly however, Jefferson was very much a product of his time, a product of the fundamentally secular Age of Reason. The rewriting of the above educational standard is, more than anything else, a rewriting of the specific historical context that made the Declaration of Independence, the U.S. Constitution and Bill of Rights, and the United States itself possible. Locke, Hobbes, Rousseau, Voltaire, Montesquieu, and Jefferson comprise a continuum of thought that broke from the religious traditions informing Aquinas and Calvin. The rewritten standard not only drops the Enlightenment down the memory hole, it tacitly suggests continuity where there is breach.

The substitution of Blackstone for Jefferson is a particularly deft move—Blackstone, a prominent British judge in the mid-eighteenth century, wrote Commentaries on the Laws of England, a treatise that was effectively the legal standard in pre-Revolutionary America and is still frequently invoked by the U.S. Supreme Court as a resource for understanding the legal and intellectual contexts of the nation's founding. Again however, this substitution suggests false continuity, for in the rewriting of the standard we lose reference to revolution, American or otherwise. Further, rather than citing a genuinely revolutionary thinker like Jefferson, we instead have a conservative jurist writing in and about a nation without any formal separation of church and state.

It is odd that the American Revolution itself would be effectively deleted from the standard by a group of vocal conservatives, especially at a moment when so much of the American right seeks to identify itself with the revolutionary fervour of the Boston Tea Party, and Jefferson's admonition about the Tree of Liberty being watered by the blood of patriots and tyrants adorns many a tee shirt and hand-lettered sign at rallies. I suppose one could sigh and shrug and chalk it up to the increasing incoherence of the U.S. right wing today, but it is more deeply troubling when that incoherence finds its way off the 24/7 news cycle and into the pages of curricula.

Of course, my reading of the change to the educational standard grants a subtlety of thought that those responsible for it probably don't deserve. To wit, faction leader Dr. Don McLeroy (the title of "doctor," I discovered while researching this, refers to his degree in dentistry) said of his methodology in shaping curriculum, "We are a Christian nation founded on Christian principles. The way I evaluate history textbooks is first I see how they cover Christianity and Israel. Then I see how they treat Ronald Reagan—he needs to get credit for saving the world from communism and for the good economy over the last 20 years because he lowered taxes." Presumably McLeroy means the economy up to, but not including, autumn 2008 and everything since; and the last time I checked, Reagan might have preached the gospel of lower taxes, but after his initial cut in 1981, he systematically raised taxes for pretty much the rest of his tenure. That I would like to see pointed out in Texas social studies textbooks, but I suppose I shouldn't hold my breath.

Monday, March 08, 2010

On not watching the Oscars



I did not watch the Academy Awards this year. Actually, my Oscars viewing has been pretty sporadic since I moved to St. John's, largely because of the one and a half hour time shift from Ontario, which means if I'm going to commit to watching the whole thing, I'm up until an ungodly hour. It has also been sporadic because so has my film viewing, at least of films that the Academy tends to favour. Of the ten best picture nominees this year (ten! am I alone in thinking that doubling the nominees was an idiotic thing to do?) I'd seen exactly half—Inglourious Basterds, Up in the Air, Up, District 9 and The Hurt Locker. Which, actually, is a better showing than I've had in previous years.

In past years I've gotten a charge out of watching the Oscars, as I think most people do—enjoying the self-indulgent pageantry, the excess, the self-importance of it all, to say nothing of the charade that these were genuinely the best films that had been made in the previous year. It's like mental junk food, and as such I particularly enjoy it when there's a snarky host like Jon Stewart who looks so uncomfortable in his own skin at the center of all this navel-gazing (I think I'm the sole person in the world who enjoyed Letterman's turn at hosting so many years ago).

I flipped on the television just as the red carpet arrivals started to be broadcast, and surprised myself with my visceral reaction to it all. For some reason this year I had very little tolerance for the sheer excess on display; I was, to put it bluntly, fairly revolted by it all. I don't know why: as already mentioned, I normally quite enjoy the spectacle. But this year, for whatever reason, I found it distasteful. Perhaps it's that such displays seem in bad taste during times of economic distress; perhaps I'm just feeling the first faint stirrings of my inner curmudgeon, which will blossom as I age and result in me evolving into a stereotypical old fart complaining about kids today and yelling at them to get off my lawn.

Whatever the reason, my heart or my shoes, I flipped off the TV in vague disgust, but was nevertheless quite pleased to discover this morning that The Hurt Locker had beaten out Avatar in almost every category in which they were both nominated (the exceptions being cinematography, which Avatar took, and original score, which they both lost to Up). I was pleased on several levels, the first and most basic being the pleasure one takes in watching the underdog trounce the odds-on favourite—especially when the underdog is a small-scale, thoughtful, well-made film, and the odds-on favourite a massively budgeted behemoth that pairs up spectacular visual effects with a reductive and cliché-ridden narrative.

I should add the caveat that I have not seen Avatar; I have therefore not been dazzled by its visual landscape, to whose innovation and imagination I will in absentia cede the praise it undoubtedly deserves. But until James Cameron pairs his virtuoso cinematography with a good script, I will always be glad to see him beaten out for Academy Awards by films like The Hurt Locker. Especially by films like The Hurt Locker.

There is a lot to recommend Kathryn Bigelow's film about the Iraq War, from its cinematography that manages to be at once desolate and claustrophobic, to the almost unbearable tension of the bomb defusal sequences, to the way it captures military life as tedium punctuated by terror. What I found most striking however, and most subtly communicated, is the way the film indicts the Iraq War as a conflict marked by a complete and utter lack of any kind of collective national sacrifice.

To back up a bit: in the run-up to the Academy Awards, there has been some interesting discussion of The Hurt Locker in the political blogosphere and in newspapers. The New York Times has been running a thoughtful series of columns on the rendering of war in film; a number of Iraq War veterans have held forth on the film's inaccuracies versus its successes and failures in communicating the general spirit or sense of the war. One fairly common critique (variously positive or negative, depending on the reviewer's perspective) is that the film does not offer a political message about the war, being wholly consumed with the psychology of the soldiers. This I find somewhat odd; the political message is there, but is not overt or explicit, and really only comes into focus in a specific scene.

The scene comes when the main character, played very well by Jeremy Renner, has been rotated home after his tour. He is back home with his wife and young son, and caught up in a series of standard domestic tasks—making dinner, cleaning leaf-choked gutters, grocery shopping. At first, this sequence feels like pretty standard fish-out-of-water fare, the soldier who has difficulty readapting to home life after the stresses and traumas of the battlefield. The key moment comes, however, in the grocery store, when his wife asks him to go back and grab a box of cereal. Confronted by a vast selection of breakfast cereal—taking up the entire aisle—he is momentarily at a bit of a loss, staring at the colourful wall of boxes for some time before finally choosing one at random.

It is a beautifully evocative moment that highlights a cruel truth about the prosecution of the Iraq War: that it was fought solely by the soldiers sent there, and demanded no collective sacrifice of the U.S. citizenry. The Iraq War is unique in this respect, in the way it has proceeded in a sort of out-of-sight-out-of-mind manner—which is not to suggest that there has been no media coverage. Rather, the war has been orchestrated in such a way as to place no demands on anyone but the soldiers fighting it and their families. The war effort has been a huge contributor to the current U.S. deficit, as the Bush Administration made no attempt to pay for it by either raising taxes or cutting spending (indeed, part of the swelling deficit under Obama is an optical illusion brought about by the fact that his administration now includes the cost of the war in the visible budget, something Bush never did).

The Iraq War is sui generis in the last hundred years of American warfare in this respect. All of the major conflicts of the twentieth century were, to a certain extent, experienced collectively by the nation, through conscription, rationing, the demand for volunteerism, the selling of war bonds to pay for the war effort, as well as the often propagandistic effort to include the nation as a whole in the war's narrative. This collectivization could result in national solidarity, such as in the Second World War, or in a national argument, as with Vietnam. Even the first Gulf War was a collective experience, however facile—the spectacle of the five weeks of bombing followed by a short, swift ground assault gave the home audiences the illusion of having some skin in the game.

Aside from the initial attack in spring 2003 (capped by the "Mission Accomplished" photo op), the current Iraq war has made minimal demands on American attention and wallets—by design. As with the immediate aftermath of 9/11, the duty of the people at home has been to consume, to extend credit to do so, not to sacrifice. The cereal aisle facing Jeremy Renner's scarred soldier appears in The Hurt Locker as the film's greatest obscenity. Which, now that I think of it, might have been in the back of my mind when I turned off the red carpet spectacle in disgust last night.

Monday, March 01, 2010

Still alive, just distracted by the Olympics

Been away from the blog for a while ... I've got a couple of posts in draft form, getting ready to go, but today's return is really all about one thing.