Thursday, March 18, 2010

Comments for this post have been disabled (no, not really)

I sometimes think fondly of that naïve time back when the InterWebs were the New Big Thing. Remember that innocent age? Bill Clinton was a loyal and faithful husband still; Jean Chretien was set to be Prime Minister For Life; some cheerful lads from Seattle taught us about the aesthetic pleasures of secondhand flannel shirts and crushing nihilism; some people predicted that the digital economy would result in the end of work; it seemed as though any basement-dwelling slacker with a smattering of HTML could pilot a dot com startup and make millions; the words "tech" and "bubble" were never contiguous; and, most gloriously, the newly connected world of the World Wide Web (a term people still used unironically in the same breath as "information superhighway") was going to lead us to a golden age of democratized information and knowledge that would free us from the constraints of our bodies and allow us to commune with the minds of others in the Digital Utopia.

Remember that time? Really, we should have paid more attention to Mike Godwin.

The Mike Godwin, that is, who formulated Godwin's Law. This brilliant insight, originally made in 1990, is as follows: "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1." Though I didn't know it at the time, this was the response I wanted to have for the various Digital New Agers and New Gnostics who saw the Internet as the Promised Land. It seemed too good to be true: whatever democratic vistas the wired world offered, it was inevitable that human nature would assert itself.

Godwin saw it. What is it about comment threads that brings out the crazies? A brief perusal of YouTube comment threads is often enough to smother my faith in humanity with a pillow. I used to read the comics online, but anything more political than Ziggy would inspire ranting, ad hominem attacks, and of course the requisite comparison of one's opponent to Hitler in the comments section (a comments section for a comics web site? Why?). When I found I was reading the comments more than the comics (and therefore wasting more time than the thirty seconds it took to read Doonesbury, Non Sequitur, and Bloom County reruns), I stopped altogether.

Comment threads are however hard to avoid, and can suck you in: before you realize it, you've spent twenty minutes reading the argument between DemonGirl23 and jerseyboy about Obama's birth certificate. And what's appalling is that, more often than not, what sucks you in is not the intelligence and insightfulness of the debate, but its extreme and toxic rhetoric. In other words, that twenty minutes you just wasted is not mitigated by having learned something; it is genuinely wasted time in which any optimism you might have had about human beings' basically decent nature has been thoroughly dashed.

What did I read recently to inspire this post, you ask? The conservative blogger and pundit Michael Medved wrote a post yesterday on TownHall.com that calmly and carefully explained why conservatives need to back away from the conspiracy theories that the Obama Administration is deliberately trying to destroy the U.S. economy so it can impose a communist regime. Medved, it should be noted, is as right-wing as they come; his post is not a defense of the President, but rather a rebuke urging the hysterical right to stop being delusional. Some figures on the right, notably David Frum, have been making similar arguments, and have largely been given grief for their attempts to be reasonable (and when Medved and Frum are the voices of reason on the right, you know we've gone down the rabbit hole).

Rather predictably, the comments thread for Medved's post was pretty virulent. I saw exactly one response that could be construed as agreeing with him. If you have the stomach for it, you should check it out, but here are some highlights:

"You ... cannot come to grips with the reality of how radical Obama is. All one must do is listen to what he says. He wants to remake this country in his image and to do that he must first destroy the old country we have known for 200+ years."

"Medved is a hack and an Obama supporter, in disguise."

"[Y]ou are deluding yourself if you think Obama is pure-of-heart and that his differences with the majority of Americans can be attributed simply to ideology. He is a radical departure from past presidents, both in his intentions and methods.

"Medved has been corrupted by his Hollywood friends. There is no logic in his argument, derived from Sean Penn like nuttiness."

"Communism got routed in the Cold War, but the Reds in the west, and there are many, didn't just crawl under a rock and die. They came up with bulldung like global warming so that they could impose things like cap and trade."

"[Obama] like every extreme socialist liberal, hates America as our founders intended to be; he thinks his failed socialistic ideas are superior, and he is doing everything he can to implement them."

"The most significant, plain, simple truth here in answer to Medved's naïve question is Obama is at best of Muslim heritage. There cannot be a sane person alive who will argue that point ... Obama is a Muslim – period."

"Whats it going to take to convince you, Medved that Obama is at least a Socialist and probably a Marxist? A hammer and sickle waving over the White House? Just remember this; the German people never thought Hitler and his cronies would do the things they did."

Aaaaaaaaaaaaaaand ... there it is. It was when I got to the last one here that I snapped out of my reverie and stopped reading. Good times.

(All that being said, I do encourage comments).

Monday, March 15, 2010

Jefferson v. Board of Education


In the last few days, all the left-leaning political blogs I read have had stories about how the Texas Board of Education has recently made changes to the Social Studies curriculum to make it conform to a social and religious conservative perspective. This is not new: since the beginning of the year, the board has made over one hundred changes in this respect. Ten of the fifteen people on the board comprise a solid conservative bloc that has introduced and ratified such changes as dropping the suggestion that Japanese internment in WWII was motivated in part by racism; teaching that the constitution does not enact a separation of church and state; introducing a defense of McCarthyism because "there were some communists who were discovered"; teaching that sexual identity, eating disorders and rape are a matter of "choice"; and requiring that students learn about "positive" things like "Phyllis Schlafly, the Contract With America, the Heritage Foundation, the Moral Majority and the National Rifle Association."

One of the changes that went through most recently is at once the most puzzling and most revealing. The headline at Think Progress reads "Texas Board of Education Cuts Thomas Jefferson out of its Textbooks." Upon further investigation, what was "cut" was a reference to Jefferson in the context of Enlightenment thought. The original standard in the social studies curriculum was to "explain the impact of Enlightenment ideas from John Locke, Thomas Hobbes, Voltaire, Charles de Montesquieu, Jean Jacques Rousseau, and Thomas Jefferson on political revolutions from 1750 to the present." The amended standard now directs teachers to "explain the impact of the writings of John Locke, Thomas Hobbes, Voltaire, Charles de Montesquieu, Jean Jacques Rousseau, Thomas Aquinas, John Calvin and Sir William Blackstone."

So while I have to assume that Think Progress is being a bit misleading and that Thomas Jefferson still resides in social studies textbooks in some form, this change is nevertheless interesting and wants parsing. On the surface, the amended version seems more or less the same as the original, minus Jefferson and plus Calvin, Aquinas and Blackstone. What is significant, however, is what else is elided in the revision: namely, references to the Enlightenment and political revolution.

That conservatives would seek to throw Jefferson out of the boat is odd, to say the least. The Founding Fathers of the U.S. constitution, especially the resonant names of Washington, Adams, Jefferson, Franklin and Madison, tend reliably to make their way into American political argument on both sides of the ideological coin to lend rhetorical weight; liberals and conservatives and everyone in between remake the Founders in their own image again and again for the simple reason that they stand as a key touchstone of American political discourse signifying all that is positive and exceptional about the U.S.

Hence, it is rather puzzling that the conservative faction on the Texas Board of Education would elide Jefferson rather than refashion him in a way that lets them claim him as one of their own. A New York Times article about these changes suggests that Jefferson was not popular among the conservatives on the board because he was the one responsible for the phrase "separation of church and state"—and indeed, much energy seems to have been specifically devoted to the erasure of that idea. One board member declared that "I reject the notion by the left of a constitutional separation of church and state," and a subsequent amendment introduced to explore how "the founding fathers protected religious freedom in America by barring the government from promoting or disfavoring any particular religion above all others" was firmly voted down.

However, the rejection of Jefferson in favour of Aquinas et al has a more pernicious element than the church and state falsehood being peddled. The facile reason offered for this change is that Jefferson was himself merely influenced by the other philosophers listed, as if he offered no substance of his own but merely parroted his predecessors. Someone with even a casual familiarity with late-eighteenth-century history and philosophy however will know that Jefferson, Thomas Paine, and the American Revolution more generally had a seismic impact on European politics—Jefferson was not merely a conduit for other thinkers, but was a brilliant philosopher in his own right whose words and ideas resonated through the storming of the Bastille and the tumultuous revolutionary spirit of nineteenth-century Europe. More importantly however, Jefferson was very much a product of his time, a product of the fundamentally secular Age of Reason. The rewriting of the above educational standard is, more than anything else, a rewriting of the specific historical context that made the Declaration of Independence, the U.S. Constitution and Bill of Rights, and the United States itself possible. Locke, Hobbes, Rousseau, Voltaire, Montesquieu, and Jefferson comprise a continuum of thought that broke from the religious traditions informing Aquinas and Calvin. The rewritten standard not only drops the Enlightenment down the memory hole, it tacitly suggests continuity where there is rupture.

The substitution of Blackstone for Jefferson is a particularly deft move—Blackstone, a prominent British judge in the mid-eighteenth century, wrote Commentaries on the Laws of England. This treatise was effectively the legal standard in pre-Revolutionary America and is still frequently invoked by the U.S. Supreme Court as a resource for understanding the legal and intellectual contexts of the nation's founding. Again however, this substitution suggests false continuity, for in the rewriting of the standard we lose reference to revolution, American or otherwise. Further, rather than citing a genuinely revolutionary thinker like Jefferson, we instead have a conservative jurist writing in and about a nation without any formal separation of church and state.

It is odd that the American Revolution itself would be effectively deleted from the standard by a group of vocal conservatives, especially at a moment when so much of the American right seeks to identify itself with the revolutionary fervour of the Boston Tea Party, and Jefferson's admonition about the Tree of Liberty being watered by the blood of patriots and tyrants adorns many a tee shirt and hand-lettered sign at rallies. I suppose one could sigh and shrug and chalk it up to the increasing incoherence of the U.S. right wing today, but it is more deeply troubling when that incoherence finds its way off the 24/7 news cycle and into the pages of curricula.

Of course, my reading of the change to the educational standard grants a subtlety of thought that those responsible for it probably don't deserve. To wit, faction leader Dr. Don McLeroy (the title of "doctor," I discovered while researching this, refers to his degree in dentistry) said of his methodology in shaping curriculum, "We are a Christian nation founded on Christian principles. The way I evaluate history textbooks is first I see how they cover Christianity and Israel. Then I see how they treat Ronald Reagan—he needs to get credit for saving the world from communism and for the good economy over the last 20 years because he lowered taxes." Presumably McLeroy means the economy up to, but not including, autumn 2008 and everything since; and the last time I checked, Reagan might have preached the gospel of lower taxes, but after his initial cut in 1981, he systematically raised taxes for pretty much the rest of his tenure. That I would like to see pointed out in Texas social studies textbooks, but I suppose I shouldn't hold my breath.

Monday, March 08, 2010

On not watching the Oscars



I did not watch the Academy Awards this year. Actually, my Oscars viewing has been pretty sporadic since I moved to St. John's, largely because of the one-and-a-half-hour time shift from Ontario, which means that if I'm going to commit to watching the whole thing, I'm up until an ungodly hour. It has also been sporadic because so has my film viewing, at least of films that the Academy tends to favour. Of the ten best picture nominees this year (ten! Am I alone in thinking that doubling the nominees was an idiotic thing to do?) I'd seen exactly half—Inglourious Basterds, Up in the Air, Up, District 9 and The Hurt Locker. Which, actually, is a better showing than I've had in previous years.

In past years I've gotten a charge out of watching the Oscars, as I think most people do—enjoying the self-indulgent pageantry, the excess, the self-importance of it all, to say nothing of the charade that these were genuinely the best films that had been made in the previous year. It's like mental junk food, and as such I particularly enjoy it when there's a snarky host like Jon Stewart who looks so uncomfortable in his own skin at the center of all this navel-gazing (I think I'm the sole person in the world who enjoyed Letterman's turn at hosting so many years ago).

I flipped on the television just as the red carpet arrivals started to be broadcast, and surprised myself with my visceral reaction to it all. For some reason this year I had very little tolerance for the sheer excess on display; I was, to put it bluntly, fairly revolted by it all. I don't know why: as already mentioned, I normally quite enjoy the spectacle. But this year, for whatever reason, I found it distasteful. Perhaps it's that such displays seem in bad taste during times of economic distress; perhaps I'm just feeling the first faint stirrings of my inner curmudgeon, which will blossom as I age and result in my evolving into a stereotypical old fart complaining about kids today and yelling at them to get off my lawn.

Whatever the reason, my heart or my shoes, I flipped off the TV in vague disgust, but was nevertheless quite pleased to discover this morning that The Hurt Locker had beaten out Avatar in almost every category in which they were both nominated (the exceptions being cinematography, which Avatar took, and original score, which they both lost to Up). I was pleased on several levels, the first and most basic being the pleasure one takes in watching the underdog trounce the odds-on favourite—especially when the underdog is a small-scale, thoughtful, well-made film, and the odds-on favourite a massively budgeted behemoth that pairs up spectacular visual effects with a reductive and cliché-ridden narrative.

I should add the caveat that I have not seen Avatar; I have therefore not been dazzled by its visual landscape, to whose innovation and imagination I will in absentia cede the praise it undoubtedly deserves. But until James Cameron pairs his virtuoso cinematography with a good script, I will always be glad to see him beaten out for Academy Awards by films like The Hurt Locker. Especially by films like The Hurt Locker.

There is a lot to recommend Kathryn Bigelow's film about the Iraq War, from its cinematography that manages to be at once desolate and claustrophobic, to the almost unbearable tension of the bomb defusal sequences, to the way it captures military life as tedium punctuated by terror. What I found most striking however, and most subtly communicated, is the way the film indicts the Iraq War as a conflict marked by a complete and utter lack of any kind of collective national sacrifice.

To back up a bit: in the run-up to the Academy Awards, there has been some interesting discussion of The Hurt Locker in the political blogosphere and in newspapers. The New York Times has been running a thoughtful series of columns on the rendering of war in film; a number of Iraq War veterans have held forth on the film's inaccuracies versus its successes and failures in communicating the general spirit or sense of the war. One fairly common critique (variously positive or negative, depending on the reviewer's perspective) is that the film does not offer a political message about the war, being wholly consumed with the psychology of the soldiers. This I find somewhat odd; the political message is there, but is not overt or explicit, and really only comes into focus in a specific scene.

The scene comes when the main character, played very well by Jeremy Renner, has been rotated home after his tour. He is back home with his wife and young son, and caught up in a series of standard domestic tasks—making dinner, cleaning leaf-choked gutters, grocery shopping. At first, this sequence feels like pretty standard fish-out-of-water fare, the soldier who has difficulty readapting to home life after the stresses and traumas of the battlefield. The key moment comes, however, in the grocery store, when his wife asks him to go back and grab a box of cereal. Confronted by a vast selection of breakfast cereal—taking up the entire aisle—he is momentarily at a loss, staring at the colourful wall of boxes for some time before finally choosing one at random.

It is a beautifully evocative moment that highlights a cruel truth about the prosecution of the Iraq War: that it was fought solely by the soldiers sent there, and demanded no collective sacrifice of the U.S. citizenry. The Iraq War is unique in this respect, in the way it has proceeded in a sort of out-of-sight-out-of-mind manner, which is not to suggest that there has been no media coverage. Rather, it has been orchestrated in such a way as to place no demands on anyone but the soldiers fighting it and their families. The war effort has been a huge contributor to the current U.S. deficit, as the Bush Administration made no attempt to pay for it by either raising taxes or cutting spending (indeed, part of the swelling deficit under Obama is an optical illusion brought about by the fact that his administration now includes the cost of the war in the visible budget, something Bush never did).

The Iraq War is sui generis in the last hundred years of American warfare in this respect. All of the major conflicts of the twentieth century were, to a certain extent, experienced collectively by the nation, through conscription, rationing, the demand for volunteerism, the selling of war bonds to pay for the war effort, as well as the often propagandistic effort to include the nation as a whole in the war's narrative. This collectivization could result in national solidarity, such as in the Second World War, or in a national argument, as with Vietnam. Even the first Gulf War was a collective experience, however facile—the spectacle of the five weeks of bombing followed by a short, swift ground assault gave the home audiences the illusion of having some skin in the game.

Aside from the initial attack in spring 2003 (capped by the "Mission Accomplished" photo op), the current Iraq war has made minimal demands on American attention and wallets—by design. As with the immediate aftermath of 9/11, the duty of the people at home has been to consume, to extend credit to do so, not to sacrifice. The cereal aisle facing Jeremy Renner's scarred soldier appears in The Hurt Locker as the film's greatest obscenity. Which, now that I think of it, might have been in the back of my mind when I turned off the red carpet spectacle in disgust last night.

Monday, March 01, 2010

Still alive, just distracted by the Olympics

Been away from the blog for a while ... I've got a couple of posts in draft form, getting ready to go, but today's return is really all about one thing.

Thursday, February 11, 2010

Like a train wreck, I can’t look away ...

God love Rachel Maddow. Need any more evidence for the hypocrisy of the current Republican party? Watch:



I hope Obama watches this.

Wednesday, February 10, 2010

Putting “alternative lifestyle” to bed (so to speak)


I have been following two fairly momentous events in gay rights south of the border—the trial in California challenging "Proposition Eight," the ballot initiative passed in the last election banning gay marriage—and the Senate hearings on the deeply flawed Clinton measure "Don't Ask, Don't Tell," which prohibits gay servicemen and -women from being open about their sexuality while in the military. Both of these events have been notable for the way in which the anti-gay factions in government and society have been forced into increasingly untenable positions to justify their prohibitions, as the standard lines about the dangers of homosexuality to the institutions of marriage and the military have their illogical and indeed irrational bases exposed.

I will not rehash the various arguments being thrown back and forth (some summaries here and here); what's got me thinking about this today is the pernicious quality of the term "alternative lifestyle," which has become synonymous with "gay." I've always disliked the term, both for its implied decadence, and for the suggestion that homosexuality—or any sexuality—is somehow a choice. It's been some time since I've argued this question with someone (one of the benefits/drawbacks of university life is a radical reduction in the number of people with socially conservative perspectives you encounter on a regular basis), but the idea that individuals "choose" homosexuality is completely at odds with common sense. Let's be clear here: I'm not talking about experimentation or private fantasy, or about those fortunate enough to find themselves in accepting and open families and communities. If one's sexuality were genuinely a matter of choice, gays and lesbians would be exclusive to urban and progressive enclaves. To put it another way: I went to a Catholic high school in the late 1980s, and that school remains the most overtly homophobic milieu I have ever experienced; teachers could hold forth on the sinfulness of homosexuality with impunity, and opine that AIDS was God's punishment; one teacher characterized gay sex as "dirty and furtive anonymous encounters in gas station bathrooms." (I hasten to add that this is not a sweeping characterization of the entire teaching staff, but of a very vocal minority, who nevertheless set the broader tone as a loud few often can. Many of my teachers were open-minded and generous in temperament and character, and bore, in hindsight, obvious antipathy to the handful of religious bigots).

If there was a group of people less tolerant than the teachers, it was the student body (or large segments of it, at any rate). The greatest insult was to be labelled a fag; the easiest way to destroy someone's reputation was to spread rumours he or she was gay; homophobic language was common, as was the suggestion of violence where gays were concerned.

When arguing against the suggestion that homosexuality is a choice, I have always related my own high school experience, and asked who, in the midst of this homophobic nightmare, would actively choose to be vilified, ostracized, and possibly subjected to violence? And yet I know of more than a few people with whom I went to high school who came out of the closet years later. I can only imagine the private hells they endured, and the courage it took to finally overcome the years in which their ostensible perfidy was preached at them.

Hence, when I see or hear the phrase "alternative lifestyle," it irks me. More properly, "alternative lifestyle" should really mean any lifestyle choice that deviates from the norm, and here as well the concept is deeply problematic. The question is: alternative to what? If the LGBT community and the prospect of gay marriage are finding increasing cultural traction, I think that has as much to do as anything else with the concomitant awareness that the idea of "normality" is itself something of a fallacy. "Alternative" family structures involving divorced parents, unmarried parents, multiple households, stepchildren and stepsiblings have been eroding the myth of the Cleavers and their white picket fencery for some time. I'm not the first to wonder why those vociferous defenders of marriage are so upset about gays wanting to wed—shouldn't they be welcoming any movement toward re-establishing marriage as desirable in the face of all these other "alternative" arrangements? Shouldn't they be happy that these putative sexual degenerates want to embrace commitment, monogamy, and the comfortable tedium of married life?

These questions, of course, are entirely disingenuous, but offer an insight into why "alternative lifestyle" has become effectively synonymous with "gay." That anti-gay proponents frame gayness as a "lifestyle choice" betrays their own particular bigotry; that is, their resentment that anyone would lead a lifestyle alternative to their own. To argue from a cultural studies perspective, the more appropriate term is "subculture." The images animating anti-gay discourse—those who, for example, claim that gays serving openly in the military will lead to cross-dressing, body art, and overt display of "exotic forms of sexual expression"—are more or less the same images conservative newspapers splash across their front pages the day after a Gay Pride March. Anyone who has celebrated Gay Pride knows this frustration: no images of parents marching in solidarity with their gay children (or vice versa), no images evoking the general sense of joy or celebration that pervades the event, no images of people dressed innocuously; rather, we are treated to a slew of those images most calibrated to shock staid sensibilities. Not that this is in itself a bad thing (after all, the sixty-year-old men in the assless chaps and the transgendered float in homage to Priscilla, Queen of the Desert likely want to shock staid sensibilities); but what gets lost in translation is that this is the pageantry of a specific subculture (or rather, a diverse set of subcultures) that has developed in defiance of exactly the kind of cultural forces determined to prevent gay marriage in California and the repeal of DADT in the U.S. military. It is not a lifestyle, any more than the denizens of New Orleans exchange beads for casual nudity 365 days a year.

One of the frequent analogies made to the repeal of DADT is that of Truman's desegregation of the military in 1948. Similar arguments were made then, about unit cohesion and the disruptive effects of having black and white soldiers serving side by each. This, I think, is a useful comparison to make, especially considering that the political forces organized against anything resembling civil rights legislation in the 50s and 60s were far more pervasive and vociferous than the outliers denouncing homosexuality today—and the integration of the U.S. armed forces was a slow and arduous process that paralleled the integration of the country more generally. Then as now however the greatest fears tended to be stoked by that catalyst that always inspires fear—the unknown. The average personal journey from prejudice to understanding only really happens through experience. I say "understanding" rather than "acceptance," because the latter has a quality of condescension. One arrives rather at an understanding of the other, ideally, through familiarity. This was indeed my own experience: I won't be disingenuous and suggest that I was somehow above the homophobic milieu of my high school, somehow more enlightened and progressive. I was a product of that context as much as anyone else. But when one of my best friends came out to me, I had to make a choice. It wasn't easy, but that friendship was more important to me than my own fears and misgivings.

It is for this reason that I look at the repeal of DADT as a crucial milestone—whatever the brave stands being taken by the higher command against congressional anti-gay forces, I have little doubt that the broader portion of the rank and file is at least ambivalent and at most hostile to openly-serving gays and lesbians—just as they were to integrating the forces in the 50s. But both history and personal experience tell me that the first step in social progress is simple contact, which forces people to confront prejudices and, more often than not, teaches them that common humanity outweighs what differentiates them.

It's worth noting that the original formulation of the policy was "Don't Ask, Don't Tell, Don't Flaunt"—which echoes the typical "I'm not homophobic, but ..." disclaimer that states "I don't care what you do in your own life, just don't flaunt it in front of me." Of course, one person's flaunting is another person's simple living. This is the pernicious aspect of DADT, for it requires silence about one of the crucial defining aspects of self. Andrew Sullivan, in his blog "The Daily Dish" at The Atlantic online, responds to those who dismiss the difficulty of concealing one's sexual orientation with this simple thought experiment: "If you're straight, try it for one day. Try never mentioning your spouse, your family, your home, your girlfriend or boyfriend to anyone you know or work with – just for one day. Take that photo off your desk at work, change the pronoun you use for your spouse to the opposite gender, guard everything you might say or do so that no one could know you're straight, shut the door in your office if you have a personal conversation if it might come up. Try it. Now imagine doing it for a lifetime. It's crippling; it warps your mind; it destroys your self-esteem."

Seriously. Imagine it. Think you could pull that off?

Monday, February 08, 2010

A post to embarrass my father

So, the Super Bowl was yesterday, and apparently one team beat another. One team did more of the thing that they were supposed to do and took home a trophy of some description. Well done. Well done, I say!

(Make you a deal, Dad -- I'll take real football seriously when you finally get around to watching the DVDs of season one of Friday Night Lights I loaned you in November).

And on the topic of fictional football -- Slate imagined what the Super Bowl would be like if it were directed by a bunch of cinema auteurs. Watch:

Wednesday, January 27, 2010

Weekly Wisdom


"If I had a large amount of money I should certainly found a hospital for those whose grip upon the world is so tenuous that they can be severely offended by words and phrases and yet remain all unoffended by the injustice, violence and oppression that howls daily about our ears."

--Stephen Fry

Monday, January 25, 2010

This month in democracy


All things being equal, I should probably be feeling a lot more deflated and cynical vis-à-vis democracy and its discontents these days than I am. Three big blows to the democratic process have been landed, one here and two in the U.S.: (1) Stephen Harper’s prorogation of parliament, (2) the U.S. Senate’s stagnation and obstructionism, and finally (3) the recent decision by SCOTUS to reverse a century of legislation restricting corporate political contributions, effectively opening the floodgates for corporate money to influence and/or buy elections and elected officials.

I have watched the special Massachusetts election of Scott Brown, which took away the Democrats’ supermajority, and the ensuing commentary with bemusement. The Village Voice captured the absurdity most succinctly with their headline, “Scott Brown Wins Mass. Race, Giving GOP 41-59 Majority in the Senate.” If ever there was a moment in which the American electoral system was showing its flaws, surely this is it—the pros and cons of the current health care reform bills notwithstanding, surely an eighteen-vote advantage should be enough to pass legislation? Apparently not with the threat of the filibuster hanging over it.

The U.S. Senate, it should be pointed out, is a pretty undemocratic body to begin with. It was originally designed to be the “sober second thought” in the crafting of legislation, with senators having six-year terms as opposed to congressmen’s two, ostensibly leading to a more stable, mature consideration of proposed bills. The senators were also fewer in number, and—most importantly—not determined by the populations they represented. Rather, each state gets two senators, regardless of size. This was less of a problem back in the late eighteenth century when there were only thirteen states, which did not have vast discrepancies in population; however, when the people of California (population thirty-six million) have the same representation in the Senate as Vermont (population six hundred thousand), the influence small states wield is wildly asymmetrical: a Vermonter’s vote for senator carries roughly sixty times the weight of a Californian’s.

But that’s neither here nor there—the filibuster has become an increasingly common tool for blocking legislation, to the point where, in practice, an obstructionist minority interested only in preventing votes (such as we now see in action) can grind the wheels of the legislative branch to a halt. Employment of the filibuster has increased more than fivefold—between 1951 and 1960, it was used an average of 3.2 times a year, whereas between 1981 and 2004 the yearly average was 16.5. (For a good breakdown of the procedural issues, read here and here).

The filibuster has a quasi-romantic quality in the popular imagination—from its very name, which means “freebooter” or “pirate,” to Jimmy Stewart’s Mr. Smith, to the West Wing episode “The Stackhouse Filibuster”—connoting a lone heroic individual standing up against the system. And indeed, certain instances of individuals filibustering do conform to that idea (though not always on the side of the angels, as with Strom Thurmond’s attempt to block the Civil Rights Act). More and more, however, it has become standard operating procedure for the minority party to thwart the majority agenda: a tactic traditionally reserved for the last resort is now business as usual.

I can’t help but see a parallel to Stephen Harper’s use of prorogation, insofar as it too is an arcane parliamentary procedure of last resort. The argument could be made that last year’s invocation of it was just that, a desperate manoeuvre to circumvent an undemocratic power grab by the opposition (not an argument I agree with, but it could be made nevertheless); no such claim could be made, however, about the decision to prorogue parliament this past December 30. This time, it was blatantly and baldly a move to avoid having to address the Afghan detainee scandal until after the Winter Olympics, by which time (Harper would hope) the public’s attention would be elsewhere.

The use of prorogation twice in the space of a year is worrisome, not least because it would seem to set a new precedent for the PM’s autocratic powers, and redefine the relationship between the PM and parliament. For me, Stephen Harper’s most troubling quality has always been his obvious desire to arrogate presidential power to the PMO and overturn the Westminster standard of “first among equals.” The high-handed use of prorogation—both times—bespeaks both arrogance and a disdain for the checks on prime ministerial power that are a cornerstone of parliamentary democracy (to say nothing of the disrespect to the Governor General, and of all the bills before the House of Commons that now die).

What keeps me buoyed in the face of these events, however, is that they have put these governmental flaws on people’s radar. Prorogation Part Two would certainly seem, at this point, to have backfired for Harper. If he banked on Canadians’ apathy to see him through, the early signs aren’t good for him: his poll numbers have dropped, and people across the country are speaking up angrily. Nor have I heard anything in the way of support for Harper’s actions—the right seems more or less mum on the subject, which is a tacit admission that the PM was way out of line on this one. I have reason to hope that this may have been a bridge too far for Harper.

Similarly, there is also some real discourse happening in the U.S. about the filibuster, and about the Senate’s arcane procedures more generally. I am less hopeful that anything will happen on that front than I am about prorogation being Harper’s Waterloo, but the discussion itself is heartening. One way or another, Obama seems to be fired up: he’s taking a more oppositional and populist tone than he has since the election, and he has re-hired his campaign manager David Plouffe to chart a new course. Hopefully, after a year of being conciliatory, he’s pissed off. The State of the Union should be a barn-burner.

Finally, the Supreme Court’s ruling last week removing the restrictions on corporate political donations doesn’t have me as outraged as I might have imagined. A large part of my calm was summed up nicely by Glenn Greenwald, who observes that any lament about the certain corporate interference in American government necessarily suggests that this is not already the case:

The reality is that our political institutions are already completely beholden to and controlled by large corporate interests (Dick Durbin: "banks own" the Congress). Corporations find endless ways to circumvent current restrictions—their armies of PACs, lobbyists, media control, and revolving-door rewards flood Washington and currently ensure their stranglehold—and while this decision will make things marginally worse, I can't imagine how it could worsen fundamentally. (Salon.com)

Sadly, I can’t disagree with Greenwald’s argument. Furthermore, in looking closely at the case and the decision, one finds that there are in fact some significant First Amendment issues that would have made the reverse ruling problematic from a free speech perspective. That the court decided to overturn a century of precedent on the funding of elections strikes me as something of a baby/bathwater situation, but it is fairly clear that there was no easy extrication from this case one way or another.

On the other hand, the ruling has evoked a storm of condemnation on both sides of the political coin, and may ironically do more to draw attention to the need to reform the way election campaigns are funded than any previous endeavours.

So, to recap: the U.S. Senate is broken, but so visibly so it has excited serious discussion about how to fix it. Stephen Harper has prorogued parliament in autocratic and arrogant fashion, and in the process ignited a grassroots protest and sent his numbers into a tailspin. And America’s Supreme Court has formalized corporate ownership of elected officials, which may well lead to a bipartisan effort to scale back the excesses of soft money. Of course, none of these eventualities may pan out—I sit here with fingers crossed. But if one or more of them do, it would be a vindication of the principle that sometimes things have to get worse before they can get better.

Wednesday, January 20, 2010

Speaking of fantasy ...

OK, so the story of Robin Hood isn’t fantasy per se, insofar as it lacks magic and magical beasts, and is, besides, ostensibly historical ... but it certainly satisfies in the same way that fantasy writing does—i.e. it is medieval in its setting and sensibility, it employs such fantasy staples as castles and knights and swordfights, and above all it provides imaginative escape from such onerous modern trappings as indoor plumbing, antibiotics, and regular bathing.

I mention this because when I found the Clash of the Titans trailer on YouTube for my previous post, number one in the “related videos” listing was the trailer for the new re-booting of the Robin Hood legend—starring Russell Crowe as the man himself, Cate Blanchett (sigh) as Maid Marion, and directed by Ridley Scott. One of the good things about having Scott as the director is that even if the film totally sucks (what was he thinking with G.I. Jane?), it is going to look amazing.

But from a local perspective, the new Robin Hood is going to be an EVENT, because the guy playing the musician Allan A’Dayle (the giant chicken in the Disney version) is none other than Alan Doyle, aka the lead singer of Great Big Sea. Whose name, seriously, is so close to the character’s that I’m already thinking of him as Alan A’Doyle. As it falls out, Doyle is, like, totally BFFs with Russell Crowe. Who knew (well, besides everyone in St. John’s)?

As with Clash of the Titans, the Scott/Crowe Robin Hood has the advantage of a low bar. Because, seriously: just how many snafus and fuck-ups during the writing, casting, and producing of a film about Robin Hood would you have to have to make it worse than Robin Hood: Prince of Thieves?

And another advantage of having Alan Doyle involved? It radically lowers the chances of having Bryan Adams on the soundtrack.

Monday, January 18, 2010

Return to Olympus

In the Interesting Trends In Upcoming Films category is an apparent resurgence in interest in Greek mythology. Well ... I say “resurgence,” but really it’s just two films, which is nevertheless two films more than we’ve seen in some time (since the god-awful Troy in 2004, at any rate).

Behind door number one we have a remake of Clash of the Titans, with a pretty impressive cast of players—and impressive not so much for its “bankable” stars as for the fact that they’re all really good actors. Liam Neeson plays Zeus, Ralph Fiennes is Hades, Danny Huston is Poseidon, Polly Walker (Atia from Rome) is Cassiopeia, and Alexa Davalos (yes, please!) plays Andromeda. Sam Worthington plays Perseus, and the only thing I know about him is that he’s the main guy in Avatar (a film it seems less and less likely I will go see the more I read about it). Also in the film, based on my exhaustive thirty-second internet research, are Pete Postlethwaite, Gemma Arterton and Jason Flemyng. So they’ve managed to assemble a pretty promising team.

They’ve also definitely updated the special effects. Remember the kitschy stop-motion animation of the first one?



I have to say it, because it is a timeless truth: giant scorpions are cool. Especially when they sting the ground in time to the soundtrack’s rhythm.

I suspect this film will be a great mindless pleasure—I get to see some of my favourite actors chewing the scenery, in a story where I don’t have to worry about subtlety or nuance. That was what ruined Troy for me—if you’re going to adapt the Iliad to film—and leave out the gods, no less!—have an eye to the story’s details. But no. Fortunately for Clash of the Titans, the original was so cheesy and bad, they can only go up from there.

The second film is adapted from a series of young adult novels in which a teenager named Percy Jackson discovers that he is the son of Poseidon and gets embroiled in the affairs of the gods (along with a cadre of other offspring of the Greek Pantheon). My question is: where were these novels when I was twelve?

Again, the film boasts a pretty impressive cast, with Uma Thurman (Medusa), Rosario Dawson (Persephone), Pierce Brosnan (Chiron), Catherine Keener (Percy’s mom), Sean Bean (Zeus), and the awesome awesome Kevin McKidd as Poseidon.

(Can I just observe as an aside that I think Sean Bean has a weakness for wearing breastplates and/or chain mail? Seriously: Odysseus in Troy, Boromir in The Fellowship of the Ring, Zeus, and Ned Stark in the upcoming adaptation of A Game of Thrones on HBO. Does his agent put these scripts in the “automatic yes” pile?)

So, Percy Jackson and the Olympians is a little glitzier in its casting, but still pretty solid. And it looks pretty good too, and doesn’t seem like it’s taking itself too seriously:



I’ve been idly thinking lately about a trend in which science fiction is waning and fantasy is waxing—at least, that seems to be the consensus on a handful of literary blogs I follow, confirmed by book sale numbers. I did a public lecture last term about the conservative and regressive tendencies of fantasy as a genre (one of the many things I neglected to blog about), and will be rehashing it in a guest lecture for our Masters in Humanities program in February. That fantasy is trending up and SF down is an interesting phenomenon at the present moment, and these Olympian films would seem (on the surface) to correspond to this tendency.

My thoughts on this are still somewhat inchoate, and I’ll return to this topic in the future. One suggestion I will make however is that part of what we’re seeing is a disenchantment with technology—not so much in nihilistic terms, such as marked the post-WWI generation, but in terms almost of boredom. I think there’s a certain sense that technology has caught up to, and indeed in some ways surpassed, the imagination of the future. The golden age of SF up through the 60s and 70s was largely enamoured of technology and science’s potential, something epitomized by Star Trek’s utopian vision; what we’re seeing now, however, is increasingly dystopian or militarized SF (think the new Battlestar Galactica), as well as an interesting trend in “literary” fiction to appropriate the trappings of dystopian SF while stubbornly resisting the label (such as Margaret Atwood’s latest offerings, and The Road by Cormac McCarthy).

Again, these are just early thoughts. Does anyone else have any ideas about this trend?

Thursday, January 14, 2010

The best television series of the aughts

The end of the decade kind of surprised me—in the sudden welter of top ten lists for movies and music and what have you, I suddenly realized that ten years had elapsed since we celebrated the end of the millennium. And in spite of my love of lists, I find myself ill-equipped to compose many of the usual suspects: I have not seen nearly enough films in the last ten years to do a best films list; I suppose I could do a best fiction list, but it would be woefully incomplete, absent the dozens of novels that were published that I did not read (ironically, because I was reading too much); and my musical tastes sort of calcified in the mid-late 90s.

On the other hand, I do watch a lot of teevee, and have in the last two years made it something of an academic sidebar. The other day my television guru friend Jen apologized on her blog for not yet having gotten to her own best of the decade, and I thought, “AHA! A list I can do!” This is indeed something I have been thinking a lot about, though not in these exact terms. So here follows my top ten list for the decade for which we have yet to devise a handy moniker (I’m holding out for the “aughts,” if for no other reason than to one day be an old-timer who can reminisce about “back in aught-seven, when I was working in the salt mines ...”).

Of course, the question of what makes a television series “good” is a vexed one at best. I have little doubt my list will furrow a lot of foreheads, so here are my criteria: I am concerned in this particular list with television that broke the mould, that challenged viewers and resisted the typical formulaic pitfalls of an episodic structure, and above all else was intelligent. More importantly, unapologetically intelligent: series that did not feel compelled to explicate their more complex elements or confuse knowledge with subtlety (as Chris Carter was so often guilty of on The X-Files).

In this respect, the aughts offered an embarrassment of riches: we saw in the past decade the rise and indeed renaissance of exceptionally well-written, well-acted, and well-produced television. HBO was the epicentre of this phenomenon, but not its exclusive purveyor: other specialty cable networks like AMC, Showtime and FX followed the Home Box Office’s lead, as did a few adventurous network forays. Further, I think critics will see the aughts as a time when serious actors decamped from Hollywood films and took on roles they could sink their teeth into over several seasons of well-written scripts: actors like Ian McShane, Glenn Close, Edward James Olmos, Mary-Louise Parker, Martin Sheen, and Steve Buscemi all appeared in deeply nuanced and complex roles; we saw late-career renaissances in Ted Danson, Alec Baldwin and Henry Winkler, and breakout performances from the likes of Jon Hamm.

So, without further ado ...


10. 30 Rock

I think I can safely say there has never been a television show more certain to make me laugh hysterically and often. Tina Fey shows us she’s got way more game than her turns as an SNL bit player and Sarah Palin impersonator would suggest: each episode is unpredictable, madcap, and usually absurd, while nevertheless making us care pretty deeply about the characters involved. What I think I love most about Tina Fey is that she is smart enough to subordinate her own ego to the show: Liz Lemon is the geeky and pathetic calm at the center of the absurdist storm, and Fey is happy to let every scene her alter ego is in get stolen by each and every one of the crazy characters she has created. Each episode is a gem, but I think my favourite was the Amadeus parody that came at the end of season 2, in which Tracy Jordan plays the pornographer Mozart to Frank Rossitano’s Salieri. Also, no comment about 30 Rock would be complete without a shout out to the comic brilliance that is Alec Baldwin as Jack Donaghy.

9. The West Wing

As much as I love this show, I was hesitant about including it, as it has—on reflection—more of a 90s feel to me, and the balance of its seasons that appeared in the aughts, from 2003 to 2006, were the post-Aaron Sorkin version of the show. Starting with season five, The West Wing lost something crucial with Sorkin’s departure. It maintained the high seriousness, the machine-gun dialogue, the unapologetically wonkish preoccupation with the inner workings of the executive branch ... but it lost the energy of Sorkin’s brand of dialogue, and with that, the humour that animated so much of the first four seasons. However much the series depicted the life-and-death drama that happens on a daily basis in the White House, it always did so with extraordinary humour, so that Charlie could give Mrs. Landingham a hard time over her new car or CJ could get emotionally attached to a turkey, even as the President dealt with the drama of a besieged embassy. And that is why it makes the list, the final three seasons notwithstanding.

8. The Sopranos

Oz was the first dramatic series that defined the new approach HBO was taking; The Sopranos demonstrated that it had legs. The series was unapologetic in its profanity, its violence, its corrupt and unlikable characters, and above all, the inescapable parallels between the mafia hierarchy and “legitimate” capitalism. None of this was new, of course—mob films had been doing it since The Public Enemy. What was different was that this was television—and the finite story arc of the gangster’s meteoric rise and inevitable fall that defined the genre was thrown out the window as the Soprano clan struggled with the ups and downs of affluent suburban family life over six seasons. The bizarre series finale that pissed so many people off was, I thought, genius; the sudden cut to black with no resolution whatsoever was the last nail the series drove into the mob genre’s coffin, suggesting that there was no tidy and poetic resolution—no Tony Montana gunned down, no Henry Hill relegated to symbolic death in suburban wasteland, no Michael Corleone dying pathetic and alone. Nope. Tony and his family muddle along, and there is no poetic justice for anyone.

7. Lost

Few shows divide opinion like Lost: people either love it with the white-hot intensity of a thousand suns (and may well issue a fatwa on me for relegating it to number seven), or hate it with equal intensity.


My own sense of the series is that J.J. Abrams & co. totally did not expect it to make it past season one, if it even got that far. They offered this amazing and bizarre setup, and then were taken a bit aback when it became a hit (or at least, this is how I imagine it happened), and then suddenly had to devise storylines that, every episode, raised more questions and offered no answers. This, predictably, became a little lame by season three, and this is when I lost interest. At the urging of certain Losties however, I started tuning in again for season four, and am now irrevocably snared in the narrative web again. The writers have found their stride and totally upped the ante for the series. Whatever its inconsistencies at times, the show is always very, very smart, well written, well acted, and always willing to throw curveballs at the audience (which is synonymous, on the Fox Network, with cancellation -- a lesson I wish Joss Whedon would learn).

6. Arrested Development

This, of course, is the show that launched Michael Cera’s career, but I am willing to overlook that offense because of just how good it was (Arrested Development, I mean, not Michael Cera's career). I am still amazed that Arrested Development survived as long as it did on network television, given its quirkiness, weird characters, quasi-absurdist humour and palimpsest of in-jokes that require multiple viewings to catch. What’s more, the series effectively picked up the challenge thrown down when Seinfeld ended, namely: where does the sitcom as a genre go from here? Seinfeld epitomized the irony and disaffection of much of the 90s by pointing to the inescapable fact that sitcoms, structurally, are necessarily about nothing. Arrested Development picked up this thread and carried on with comically one-dimensional characters all caught up in their own narcissistic dramas. The Bluth clan broke the mould: all sitcoms are about family in one capacity or another, and the underlying unity informing them is what makes them endearing. Arrested Development had the audacity to depict the most merrily—and obliviously—dysfunctional family ever.

5. Battlestar Galactica

If someone had told me five years ago that they were going to (a) remake Battlestar Galactica and (b) that it would be one of the best television shows ever made, I’d have responded, respectively, “Good luck with that” and “Uh ... sure.” But not only did they update the 1970s kitsch-fest that was the original series with kick-ass production values and visuals—which would have been cool enough in and of itself—they also turned it into one of the most complex, nuanced sagas of loss and redemption this side of East of Eden. Unapologetically smart, beautifully written, and indeed daring in the Big Questions it poses (without offering trite or moralistic answers, ever), it raised the bar not just for SF television but for television generally.

4. Deadwood

What The Sopranos did for the mob genre and BSG did for space opera, Deadwood does for the western. The series is really about the brutal and violent origins of democracy: we follow the lawless mining camp of Deadwood as it slowly moves toward civic and municipal identity and ultimate annexation to the U.S.A., and watch the body count rise en route. The show is beautiful in its dirt and grit, and its language raises profanity to a Shakespearean level. The final season was a bit of a disappointment because it was rushed to a close—finished at three seasons rather than four for budgetary reasons—but even at its worst moments, it was superior to pretty much everything else on the tube.

3. The Daily Show with Jon Stewart / The Colbert Report

If the aughts were the Bush/Cheney decade, we can be relieved that they were also the decade of Stewart/Colbert. It is a sad statement that the most incisive speakers of truth to power were a couple of clowns on Comedy Central, but that statement in and of itself does nothing to detract from the keen intelligence and often brutal satire of a pair of shows that thrived in a decade when satire and irony were otherwise defanged by a mendacious administration and an inept media. Stewart and Colbert gave us a breath of intelligence and hope, and threw in juvenile dick jokes as well, just to remind us that they’re both really goofs at heart.



2. Mad Men

At once the antidote to simplistic nostalgia about the 50s and 60s and a suave and stylish fetishization of the period, Mad Men critiques Madison Avenue culture by seducing us with the very kind of polished surfaces Sterling Cooper strives to market. “What you call love was invented by guys like me to sell nylons,” says Don Draper dismissively, even as he realizes that his mistress and her young lover are, in fact, in love. Such is the subtlety of Jon Hamm’s acting that in that instant, in the brief tightness in his voice, we see Draper’s own futile desire for something he can’t comprehend. What do you get when a superlative salesman invents himself based on what he tells America it wants? You get Don Draper—possibly the single greatest character to appear on a television screen in this or any decade.

1. The Wire

More than any other television series ever, The Wire found itself compared in style and structure to the novel. “Dickensian” was the word most often used, though to my mind this is entirely a misnomer based on the show’s depiction of urban decay and poverty: where Dickens took his readers by the hand and essentially explained everything as it unfolded, the web of individual stories making up The Wire unfolds with little or no attempt at explanation. It takes until about the third or fourth episode to get a handle on things, and at that point there’s no looking back. David Simon’s saga of the fallen city of Baltimore and its struggles with drugs, violence, poverty and de facto racial segregation is not so much an allegory of America as a microcosm. Each season focused on a specific part of the city: season one was the police force versus the drug dealers; season two, union politics and the fall of the American working class; three, city hall; four, the school system; and finally, in season five, the media in the form of a dying newspaper. The structures of power in “legitimate” Baltimore parallel and intersect the drug world’s hierarchies, and the money cleaves them all together like an insidious glue.



Honourable Mentions: Rome, Damages, Dexter, Weeds, Angel, Oz, The Tudors, Firefly, Gilmore Girls, Six Feet Under, True Blood, Veronica Mars, Friday Night Lights, Pushing Daisies

Top Ten Characters (and the actors who portray them), in no particular order:

Dexter Morgan (Michael C. Hall), Dexter
Lester Freamon (Clarke Peters), The Wire
Patty Hewes (Glenn Close), Damages
Benjamin Linus (Michael Emerson), Lost
Don Draper (Jon Hamm), Mad Men
Omar Little (Michael K. Williams), The Wire
Al Swearengen (Ian McShane), Deadwood
Spike (James Marsters), Angel and Buffy the Vampire Slayer
Jack Donaghy (Alec Baldwin), 30 Rock
Anne Boleyn (Natalie Dormer), The Tudors

Top ten bad guys (or are they?):
Stringer Bell (Idris Elba), The Wire
the lawyers of Wolfram & Hart, Angel
Benjamin Linus (Michael Emerson), Lost
Number Six (Tricia Helfer), Battlestar Galactica
George Hearst (Gerald McRaney), Deadwood
Arthur Frobisher (Ted Danson), Damages
Vern Schillinger (J.K. Simmons), Oz
Devon Banks (Will Arnett), 30 Rock
Felicia “Snoop” Pearson (Felicia Pearson), The Wire
Dr. Horrible (Neil Patrick Harris), Dr. Horrible’s Sing-Along Blog

Best potty-mouths:
Al Swearengen (Ian McShane), Deadwood
Debra Morgan (Jennifer Carpenter), Dexter
Tony Soprano (James Gandolfini), The Sopranos
Silvio Dante (Steve van Zandt), The Sopranos
Felicia “Snoop” Pearson (Felicia Pearson), The Wire

Too quirky (or weird) for words:
Kenneth the NBC Page (Jack McBrayer), 30 Rock
Kirk Gleason (Sean Gunn), Gilmore Girls
Dwight Schrute (Rainn Wilson), The Office
Winifred "Fred" Burkle (Amy Acker), Angel
The Doctor (Christopher Eccleston), Doctor Who
River Tam (Summer Glau), Firefly
Olive Snook (Kristin Chenoweth), Pushing Daisies

Shows that were probably in the running, but I have never watched: Breaking Bad, In Treatment, Big Love, Flight of the Conchords, Curb Your Enthusiasm

Those amazing Brits: The Office, Extras, Dead Set (best zombie TV series ever), Torchwood: Children of Earth, Doctor Who

The Auteurs: Joss Whedon (Buffy The Vampire Slayer, Angel, Firefly, Dollhouse), J.J. Abrams (Felicity, Alias, Lost, Fringe), David Simon (The Wire, The Corner, Generation Kill, Treme), Alan Ball (Six Feet Under, True Blood), David Milch (Deadwood), David Chase (The Sopranos), Aaron Sorkin (The West Wing, Studio 60 on the Sunset Strip), Amy Sherman-Palladino (Gilmore Girls), Matthew Weiner (The Sopranos, Mad Men)


So ... what’s on everyone else’s list?

Friday, January 08, 2010

Just a thought

According to the radio this morning, one of the intelligence failures that allowed Umar Farouk Abdulmutallab to board the plane he nearly blew up was the U.S. State Department’s inability to determine whether or not he had a visa. He did; the State Department thought he didn’t. Why? His name was misspelled.

This is disturbing in and of itself, and is a glaring demonstration of why relying on technology rather than human intelligence and common sense gets us into trouble. But what strikes me as particularly worrisome is that if a misspelled name can lead to a terrorist slipping through the cracks, we’re doubly in trouble when the principal antagonists originate from countries that don’t share our alphabet.

Seriously. We can’t settle on how to spell al-Qaeda (al-Qaida?). If Momar Qaddafi (Moammer Khadafi? Muammar al-Gaddafi?) didn't travel everywhere with a huge tent and sexy bodyguards, he could probably board a plane for Des Moines tomorrow without being identified.

Wednesday, January 06, 2010

The government endorses my laziness

I'd been feeling guilty lately about being slack in posting, but it just occurred to me today -- I have not, in fact, been delinquent in my blogging ... I just prorogued "An Ontarian in Newfoundland" back at the end of November. So not only is my absence here explainable, it's also constitutional.

Thank you, Stephen Harper, on behalf of all Canadians, for this catch-all excuse for abject laziness. Once again, your moral example is like a beacon of light unto the country. Or a beacon of something, anyway.

Now, if I can just convince Michaëlle Jean to dissolve the winter semester ...

Wednesday, November 25, 2009

The illusion of substance

A student sent me this, possibly in retribution for something—one way or another, it made my head hurt.



I have little doubt about two things here: first, that it is entirely likely any number of answers of substance and intelligence didn’t make the cut because they didn’t fit the overall narrative; and second, that one could easily have cobbled together a comparable series of interviews at Obama’s election night rally in which people vapidly spouted empty rhetoric about hope and change. However, there is a third thing I have little doubt about as well, and that’s that the Palin supporters who can speak substantively to the issues will always be vastly outnumbered by the Obama supporters who can do the same.

I can claim this with confidence for two reasons. The first is simply the law of large numbers: whatever spin has been put on Palin’s popularity, she is actually a lot less popular than she has been made out to be, and her favourability rating is significantly lower than Obama’s (see Christopher Beam’s astute distinction between “favourability” and “job approval” here—he makes the point that Palin’s job approval rating can’t be measured because, well, she has no job). If intelligence occurs in roughly the same proportion among both sets of supporters, the larger crowd simply yields more people who can speak substantively—hence, higher numbers for Obama.

More importantly, however, the number of Palinites able to speak substantively to policy issues will be significantly smaller because Palin herself has no substance. None. And it has been exasperating and baffling to read a series of op-eds this week and see otherwise intelligent people falling victim to the illusion that Palin’s largely media-driven persistence on the American stage is evidence of substance. Rex Murphy in The Globe and Mail and Maureen Dowd and Frank Rich in The New York Times all opined that people on the left (and some on the right) who claim Palin is a “joke” need to rethink their position—that she does in fact possess political wiles and savvy well in excess of what people imagine. Dowd states that Palin “reigns over thrilled subjects thronging to a politically strategic swath of American strip malls” and that “Democrats would be foolish to write off her visceral power.” Rich claims, “Palin is far and away the most important brand in American politics after Barack Obama, and attention must be paid.”

It was, however, Rex Murphy’s column in this past weekend’s Globe that most dismayed me: whatever issue I might take with some of his positions, I can usually count on him for good contrarian common sense. His column was particularly irritating because he made some spot-on observations but framed them in assumptions about the character of Obama supporters, and brought it all to an embarrassingly (for him) wrong-headed conclusion.

He says, “It will make Obama fans perspire to hear this, but Ms. Palin has a more forceful bond with her supporters than he with his.” This observation is exactly right, but I wonder why he feels the need for the caveat—Obama supporters are well aware of Palin’s forceful connection with her base. Indeed, it is one of the key things that worries many people about Palin: the bond is not intellectual but instinctive, proceeding not from the mind but from the gut, and it is a manifestation of the most troubling elements of American reactionary nativism. In this she is not, as Mr. Murphy suggests, a unique new force in American politics, but the latest in a line of American conservative politicians (following the likes of Dick Armey, Karl Rove and Mike Huckabee) to achieve a radioactive half-life balanced somewhere between actual elected officials and pundits like Glenn Beck and Sean Hannity. She is, to coin a term, a politainer: a political actor concerned not with policy or governing but with her own brand, who, when she pronounces on policy, delivers what Jon Stewart (always the voice of sanity) calls “a conservative boilerplate mad-lib … delivered as if it were the hard-earned wisdom of a life well lived.”

It is his conclusion, however, that is most egregious: “Ms. Palin has rare gifts and stamina enough to give them play.” Certainly, she has a keen ability to cater to reactionary American nativism; whether that counts as a “rare gift” is doubtful. But stamina? This is the governor who quit well before her term was up, just as tanking oil prices put Alaska’s economy in a spin; who didn’t actually write her own book; and whose political pronouncements emanate from Facebook and Twitter, not exactly forums that allow for complex or nuanced thought.

Yet, Mr. Murphy observes, people respond to her intuitively and viscerally, and where there’s smoke there must be fire—forgetting, presumably, that in politics where there’s smoke there are more likely to be mirrors. The abject loathing she inspires, not just from the left but from the conservative intelligentsia, apparently equates to substance; he writes, “A truly dumb and witless person would not have the demure columnist David Brooks hissing dismissively, angrily in fact, on a Sunday morning talk show that Sarah Palin ‘is a joke’ … Empty vessels do not inspire such venom and fury.” With all respect, Mr. Murphy needs to take a longer look at the culture of celebrity today—empty vessels fire the popular imagination as they never have in the past, and the heat they generate is no evidence of light.

Tuesday, November 17, 2009

The weirdness of academic conferences



Last Sunday we finished up the annual conference of the Canadian Association of American Studies (CAAS), my favourite academic organization. I’ve been attending CAAS for four years, and have been on the executive committee for almost that long. Last year I organized the conference and hosted it at Memorial; this year, happily, it was held in London, Ontario, which meant (a) I was on very familiar ground, and (b) I got to spend time with Kristen (I’m writing this entry at her place while she growls at the Library Sciences paper she’s writing).

(I also have a post on the CAASblog about the conference here).

The annual CAAS conference is a highlight for me, in part because it always feels a bit like going home (even when it’s not actually in St. John’s or London). I am and have been a member of a variety of scholarly societies, and usually attend the Congress of the Humanities every year in May. CAAS is the pleasant antithesis of the Congress—small (this year there were about ninety papers), tight-knit, friendly, and supportive. There’s something to be said for seeing the same faces every year, especially when you can be confident it will lead to fruitful collaborations and discussion.

CAAS is a comfort on this front, because academic conferences—especially as they get larger—can be, if not necessarily daunting, then certainly odd.

There’s always a moment at conferences, usually when I’m sitting in on a panel whose papers are all way outside my areas of expertise and interest, when I reflect on the strange beast that is the academic humanities conference. I’ve tried to explain to non-academic friends and family the ostensible purpose and usefulness of conferences, and I have on occasion defended them in principle when friends and colleagues in academia inveigh against them as wastes of time, money, and energy. The thing is, I sort of agree with both perspectives: I always look forward to conferences as a chance to travel (even when the location is no great destination), to see friends and colleagues from other universities, and, yes, to hear papers and engage in discussions about the subjects at hand.

At the same time, there are always points at which the conference experience can be excruciating—even when the topics or themes under discussion are of interest to you. It is at moments like these that I wonder why we do it: usually when I’m sitting in an overly heated hotel conference room, in an uncomfortable chair, trying to be interested in a poorly presented paper droning on well past its allotted twenty minutes … with no indication that the panel chair will take steps to bring the presenter to a close. Of course, the disturbing thing at those moments is wondering how many people in my audience are thinking the same about my paper.

The fact is that at your average conference you have to kiss a lot of frogs, and realize as well that you will yourself be one of those frogs for any number of people. Which is why, in my experience, the larger the conference, the more likely that feeling of futility is to surface. I return to CAAS every year knowing that the conference will be a collaborative effort rather than an atomized group of people there to put in the twenty minutes necessary to get a line on a CV.

Saturday, October 17, 2009

Chickens, as in the coming home and roosting thereof


Unsurprisingly, the saga of Rush Limbaugh’s failed attempt to buy the St. Louis Rams has made for some amusing commentary on both sides of the political divide, with those on the left barely able to contain their glee, and those on the right crying foul and proclaiming Rush a martyr to political correctness. I, of course, count myself among the first group, with the one difference being that I am making no attempt to contain my glee. Schadenfreude is fun.

What amuses me the most, and what I find most instructive about this minor but entertaining incident, is the “free speech” spin Rush and his apologists have put on it. Joseph Ziegler at Big Hollywood, speaking about the snub, stated, “I strongly believe that it also represents a seminal moment in our cultural history as well as the sad state of free speech in this country.” This kind of response is typical of a certain species of ultra-conservative commentator: flame-throwers who spew hatred and untruth and then whine that their First Amendment rights are being abrogated when their comments are subjected to criticism. David Horowitz, the serially mendacious attacker of liberal bias in universities, employs this gambit quite frequently—often claiming that the fact he is not invited to speak at campuses is a form of censorship (which has nothing to do with the hefty fee he demands, or with the fact that those booking speakers tend not to invite people who will spend the lecture attacking them).

Even more bizarre was a post at the conservative blog Red State, which eulogized Rush’s failed ownership bid in language I really, really hope is meant to be facetious: “Earlier this evening, as most of you now know, one of our own, Rush Hudson Limbaugh, while taking withering fire, crashed and burned. Tonight, Rush is no longer ‘just’ a radio personality. Tonight, Rush is no longer ‘just’ a NFL owner denied. Tonight, Rush is us. And we are him.” The “attack” on Limbaugh, the poster maintained, is representative of the left’s totalitarian intentions, which “proved that they will stop at nothing to end our dreams.” As egregious as this claim is, it is topped at the end when the poster quotes, in its entirety, Pastor Martin Niemöller’s famous poem about the Nazis:

First they came for the communists, and I did not speak out—because I was not a communist;
Then they came for the socialists, and I did not speak out—because I was not a socialist;
Then they came for the trade unionists, and I did not speak out—because I was not a trade unionist;
Then they came for the Jews, and I did not speak out—because I was not a Jew;
Then they came for me—and there was no one left to speak out for me.


Seriously? You’re seriously going to compare the failed bid of a multi-millionaire to buy a crappy NFL franchise to the Holocaust? The rhetoric has gotten way out of hand these days, and when American conservatives claim their Nazi analogies are equivalent to the left’s cries of “fascism” against the Bush administration, they’re ignoring two crucial points: (1) just because some extreme voices on the left made such intellectually dishonest accusations doesn’t make the reverse acceptable, and (2) the people calling Bush et al. fascists were a small minority and by no means in the mainstream—which cannot be said of Rush Limbaugh, Glenn Beck, and the rest of the Fox News contingent. Nor is this insanity limited to wingnut blogs like Red State: when an Obama spokesperson announced earlier this week that "Fox News often operates either as the research arm or the communications arm of the Republican party. We're going to treat them the way we would treat an opponent," Glenn Beck interpreted this as an infringement of free speech and also invoked Pastor Niemöller’s poem: “Ask yourself this question: When they are done with Fox and you decide to speak out on something … the old, ‘first they came for the Jews and I wasn’t Jewish’.”

(I recently taught George Orwell’s famous essay “Politics and the English Language” to my first-year English class, and have been working on a post about words and phrases that need to be shelved until people are ready to use them intelligently again. Not to give too much away, but pretty much anything involving fascism, Hitler, or Nazism is on the list.)

To return to my original topic: my favourite part of the whole Limbaugh-NFL saga is that, for all the cries of liberal interference and thought policing, what is really at work here is capitalism in action. In spite of what some conservative commentators would like to believe, it wasn’t Jesse Jackson and Al Sharpton who sabotaged Rush’s ownership bid, but the other people with whom Rush was planning to buy the Rams. The other rich white men pooling their resources dropped Rush at the first sign of rough water, basically as soon as the NFL and the players’ union expressed concern about Rush’s history of making racially charged comments. This, to be clear, was not a prohibition exercised from on high by the forces of political correctness, but a business decision. The NFL is pretty controversy-averse, and it is doubtful whether the ownership bid would ultimately have been successful even if Rush’s cabal had stuck to their guns; but the fact is they didn’t, in the interests of their prospective investment. If Rush wants to portray himself as a martyr, he should be attacking his erstwhile business partners for lacking intestinal fortitude and the courage of their convictions—but of course he won’t, because their convictions begin and end with the bottom line on a ledger, and the ideology of money is one Rush embraces. So he and his dittoheads will attack the left, the Obama Administration, the NFL itself, and Bob Costas (seriously), but in the end Rush’s very public opinions on race made him toxic to a business that trades to a great degree on the goodwill of a large African-American constituency.

And that’s not censorship—that’s being forced to take responsibility for one’s words.

Friday, October 16, 2009

Danny Williams poses with Arnold Schwarzenegger for a photo op ...


... and somewhere in Toronto, a This Hour Has 22 Minutes writer jizzes in his pants.