Wednesday, November 25, 2009
I have little doubt about two things here: first, that it is entirely likely there were any number of answers of substance and intelligence that didn’t make the cut because they didn’t fit the overall narrative; and second, that one could have easily cobbled together a comparable series of interviews at Obama’s election night rally in which people vapidly spouted empty rhetoric about hope and change. However, there is a third thing I have little doubt about as well, and that’s that the Palin supporters who can speak substantively to the issues are always going to be vastly outnumbered by Obama supporters who can.
I can claim this with confidence for two reasons. The first is simply the law of large numbers: whatever spin has been put on Palin’s popularity, she is actually a lot less popular than she has been made out to be; and her favourability rating is significantly lower than Obama’s (see Christopher Beam’s astute distinction between “favourability” and “job approval” here—he makes the point that Palin’s job approval rating can’t be measured because, well, she has no job). Hence, even if intelligence occurs naturally in the same proportion in both camps, Obama’s much larger base translates into higher numbers of articulate supporters.
More importantly however, the number of Palinites able to speak substantively to policy issues will be significantly less because Palin herself has no substance. None. And it has been exasperating and baffling to read a series of op-eds this week and see otherwise intelligent people falling victim to the illusion that Palin’s largely media-driven persistence on the American stage is evidence of substance. Rex Murphy in The Globe and Mail and Maureen Dowd and Frank Rich in The New York Times all opined that people on the left (and some on the right) claiming that Palin is a “joke” need to rethink their position—that she does in fact possess political wiles and savvy well in excess of what people imagine. Dowd states that Palin “reigns over thrilled subjects thronging to a politically strategic swath of American strip malls” and that “Democrats would be foolish to write off her visceral power.” Rich claims, “Palin is far and away the most important brand in American politics after Barack Obama, and attention must be paid.”
It was however Rex Murphy’s column in this past weekend’s Globe that most dismayed me, as whatever issue I might take with some of his positions, I can usually count on him for good contrarian common sense. His column was particularly irritating because he made some spot-on observations, but framed them in assumptions about the character of Obama supporters, and brought it all to an embarrassingly (for him) wrong-headed conclusion.
He says, “It will make Obama fans perspire to hear this, but Ms. Palin has a more forceful bond with her supporters than he with his.” This observation is exactly right, but I wonder why he feels the need for the caveat—Obama supporters are well aware of Palin’s forceful connection with her base. Indeed, it is one of the key things that worries many people about Palin: the bond is not intellectual but instinctive, proceeding not from the mind but the gut, and it is a manifestation of the most troubling elements of American reactionary nativism. In this she is not, as Mr. Murphy suggests, a unique new force in American politics, but the latest in a line of American conservative politicians (following the likes of Dick Armey, Karl Rove and Mike Huckabee) to achieve a radioactive half-life balanced somewhere between actual elected officials and pundits like Glenn Beck and Sean Hannity. She is, to coin a term, a politainer: a political actor concerned not with policy or governing but her own specific brand, and when she pronounces on policy delivers what Jon Stewart (always the voice of sanity) calls “a conservative boilerplate mad-lib … delivered as if it were the hard-earned wisdom of a life well lived.”
It is his conclusion however which is most egregious: “Ms. Palin has rare gifts and stamina enough to give them play.” Certainly, she has a keen ability to cater to reactionary American nativism; whether that counts as a “rare gift” is doubtful. But stamina? This is the governor who quit well before her term was up, just when tanking oil prices put Alaska’s economy in a spin; who didn’t actually write her own book; and whose political pronouncements emanate from Facebook and Twitter, not exactly forums that allow complex or nuanced thought.
Yet, Mr. Murphy observes, people respond to her intuitively and viscerally, and where there’s smoke there must be fire—forgetting, presumably, that in politics where there’s smoke there’s more likely to be mirrors. The abject loathing she inspires, not just from the left but from the conservative intelligentsia, apparently equates to substance; he writes, “A truly dumb and witless person would not have the demure columnist David Brooks hissing dismissively, angrily in fact, on a Sunday morning talk show that Sarah Palin ‘is a joke’ … Empty vessels do not inspire such venom and fury.” With all respect, Mr. Murphy needs to take a longer look at the culture of celebrity today—empty vessels fire the popular imagination as they never have in the past, and the heat they generate is no evidence of light.
Tuesday, November 17, 2009
(I also have a post on the CAASblog about the conference here).
The annual CAAS conference is a highlight for me, in part because it always feels a bit like going home (even when it’s not actually in St. John’s or London). I am and have been a member of a variety of scholarly societies, and usually attend the Congress of the Humanities every year in May. CAAS is the pleasant antithesis of the Congress—small (this year there were about ninety papers), tight-knit, friendly, and supportive. There’s something to be said about seeing the same faces every year, especially when you can be confident that this will lead to fruitful collaborations and discussion.
CAAS is a comfort on this front, because academic conferences—especially the larger they get—can be, if not necessarily daunting, then certainly odd.
There’s always a moment at conferences, usually when I’m sitting in on a panel whose papers are all way outside my areas of expertise and interest, when I reflect on the strange beast that is the academic humanities conference. I’ve tried to explain to non-academic friends and family the ostensible purpose and usefulness of conferences, and I have on occasion defended them in principle when friends and colleagues in academia inveigh against them as wastes of time, money, and energy. The thing is, I sort of agree with both perspectives; I always look forward to conferences as a chance to travel (even when the location is no great destination), to see friends and colleagues from other universities, and, yes, to see papers and engage in discussions about the subjects being discussed.
At the same time, there are always points at which the conference experience can be excruciating—even when the topics or themes under discussion are of interest to you. It is at times like that that I wonder why we do it, usually when I’m sitting in an overly heated hotel conference room in an uncomfortable chair trying to be interested in a poorly presented paper droning on well past its allotted twenty minutes … with no indication that the panel chair will be taking steps to bring the presenter to a close. Of course, the disturbing thing at those moments is wondering how many people in my audience will be thinking that about my paper.
The fact is that at your average conference you have to kiss a lot of frogs, and realize as well that you will yourself be one of those frogs for any number of people. Which is why, in my experience, the larger the conference the more likely that feeling of futility will surface. I return to CAAS every year knowing that the conference will be more of a collaborative effort than an atomized group of people there to put in the twenty minutes necessary to get a line on a CV.
Saturday, October 31, 2009
Saturday, October 17, 2009
What amuses me the most, and what I find most instructive about this minor but entertaining incident, is the “free speech” spin Rush and his apologists have put on it. Joseph Ziegler at Big Hollywood, speaking about the snub, stated, “I strongly believe that it also represents a seminal moment in our cultural history as well as the sad state of free speech in this country.” This kind of response is typical of a certain species of ultra-conservative commentator, flame-throwers who spew hatred and untruth and then whine that their first amendment rights are being abrogated when their comments are subjected to criticism. David Horowitz, the serially mendacious attacker of liberal bias in universities, employs this gambit quite frequently—often claiming that the fact he is not invited to speak at campuses is a form of censorship (and has nothing to do with the hefty fee he demands, or that most of those booking speakers don't tend to invite people who will then spend their lecture attacking them).
Even more bizarre was a post at the conservative blog Red State, which eulogized Rush’s failed ownership bid in language I really, really hope is meant to be facetious: “Earlier this evening, as most of you now know, one of our own, Rush Hudson Limbaugh, while taking withering fire, crashed and burned. Tonight, Rush is no longer ‘just’ a radio personality. Tonight, Rush is no longer ‘just’ a NFL owner denied. Tonight, Rush is us. And we are him.” The “attack” on Limbaugh, the poster maintained, is representative of the left’s totalitarian intentions, which “proved that they will stop at nothing to end our dreams.” As egregious as this claim is, it is topped at the end when the poster quotes, in its entirety, Pastor Martin Niemöller’s famous poem about the Nazis:
First they came for the communists, and I did not speak out—because I was not a communist;
Then they came for the socialists, and I did not speak out—because I was not a socialist;
Then they came for the trade unionists, and I did not speak out—because I was not a trade unionist;
Then they came for the Jews, and I did not speak out—because I was not a Jew;
Then they came for me—and there was no one left to speak out for me.
Seriously? You’re seriously going to compare the failed bid of a multi-millionaire to buy a crappy NFL franchise to the Holocaust? The rhetoric has gotten way out of hand these days, and when American conservatives claim an equivalency between their Nazi analogies and the left’s cries of “fascism” against the Bush administration, they’re ignoring two crucial points: (1) just because some extreme voices on the left made such intellectually dishonest accusations doesn’t make the reverse acceptable, and (2) the people calling Bush et al fascists were a small minority and by no means in the mainstream—which cannot be said of Rush Limbaugh, Glenn Beck, and the rest of the Fox News contingent. Nor is such insanity limited to wingnut blogs like Red State: when an Obama spokesperson announced earlier this week that "Fox News often operates either as the research arm or the communications arm of the Republican party. We're going to treat them the way we would treat an opponent," Glenn Beck interpreted this as an infringement of free speech and also invoked Pastor Niemöller’s poem: “Ask yourself this question: When they are done with Fox and you decide to speak out on something … the old, ‘first they came for the Jews and I wasn’t Jewish’.”
(I recently taught George Orwell’s famous essay “Politics and the English Language” to my first-year English class, and have been working on a post about words and phrases that need to be shelved until people are ready to use them intelligently again. Not to give too much away, but pretty much anything involving fascism, Hitler, and Nazism are all on the list.)
To return to my original topic here, my favourite part of the whole Limbaugh-NFL saga is that, for all the cries of liberal interference and thought policing, what really is at work here is capitalism in action. In spite of what some conservative commentators would like to believe, it wasn’t Jesse Jackson and Al Sharpton who sabotaged Rush’s ownership bid, but the other people with whom Rush was planning to buy the Rams. The other rich white men pooling their resources dropped Rush at the first sign of rough water, basically as soon as the NFL and the players’ union expressed concern about Rush’s history of making racially charged comments. This, to be clear, was not a prohibition exercised from on high by the forces of political correctness, but a business decision. The NFL is pretty controversy-averse, and it is doubtful whether the ownership bid would have ultimately been successful if Rush’s cabal had stuck to their guns; but the fact is they didn’t, in the interests of their prospective investment. If Rush wants to portray himself as a martyr, he should be attacking his erstwhile business partners for lacking intestinal fortitude and the courage of their convictions—but of course he won’t, because their convictions begin and end with the bottom line on a ledger, and the ideology of money is one Rush embraces. So he and his dittoheads will attack the left, the Obama Administration, the NFL itself, and Bob Costas (seriously), but in the end Rush’s very public opinions on race made him toxic to a business that trades to a large degree on the goodwill of a substantial African-American constituency.
And that’s not censorship—that’s being forced to take responsibility for one’s words.
Friday, October 16, 2009
Monday, September 28, 2009
So the so-called "Flamer" movement I mentioned in my last post? A spoof. It is one of a number of groups on the left replying to militant anti-health care rhetoric with parodies of its worst excesses. My favourite so far was the "Billionaires for Wealthcare" who attended Glenn Beck's 9/12 rally in Washington, carrying signs saying things like "If we ain't broke, don't fix it" and "Our death panels turn a profit!" They also sang a song to the tune of "The Battle Hymn of the Republic," whose refrain went:
If our healthcare corporation
Never faces regulation,
We'll be brimming with elation!
Let's save the status quo!
So, the "Flamers" (really, the name should have tipped me off, but then again the people they're parodying initially embraced the moniker "teabaggers" before someone told them what that actually meant) are of a piece with these guys, responding to the Glenn Becks of the world in what is probably the only rational way.
Of course, when you have people at town halls holding signs like this:
... or white supremacists marching for "white civil rights," that line between satire and reality gets disturbingly blurry.
Saturday, September 26, 2009
1. “I’m sorry, you haven’t paid your fees. I’m afraid we have no choice but to let your house burn.”
First there were the Birthers, those people who claim Obama wasn’t born in America. Then the Deathers, who maintain government-run health care will result in “death panels” deciding whether you get treatment or get to die. And then the Tenthers, a group that interprets the Tenth Amendment as meaning that individual states have the right to reject any law issuing from the federal government.
Now, apparently, we have the Flamers. No, they’re not gay Republicans or Calgary fans; they are a group dedicated to, and I quote, “The privatization of everything in America.” That’s right—EVERYTHING. And their nickname? It derives from their favourite project, which is the privatization of firefighting in the United States. Troy Conrad, president, says “I shouldn’t have to pay for your fire.” Firefighting, he maintains—like everything else—should be a for-profit business. “All we know,” he says, “is that the free market always works as long as it’s unregulated. Even if that means a lot of people have got to die, it’s still worth it, because unregulated free market is the way to go.”
Watch him explain his point:
I watched this twice through, trying to figure out if this was just an elaborate hoax. I’m still not entirely sure it’s not, but there seems to be evidence that it’s genuine. They even have their own website, called Angry Town Hall.
I couldn’t make this shit up if I tried.
What is perhaps weirdest about this for me is that one of the best arguments for U.S. government-run health care I read this summer, and one of the best debunkings of the “socialist” accusation, was an op-ed piece pointing to the government takeover of firefighting in the mid-nineteenth century. Firefighting was, once upon a time, run on a private for-profit basis. The government finally stepped in because, besides the catastrophic damage that predictably occurred when fires raged in poor and hence unprofitable parts of cities, there were increasing numbers of suspicious fires occurring, as well as firemen standing idle until the property owners agreed to pay a much steeper fee.
2. Sarah Barracuda hits the lecture circuit
For an undisclosed fee probably in the hundreds of thousands, Sarah Palin delivered a talk to a group of investors in Hong Kong. Apparently, she spoke for about ninety minutes, the very thought of which makes my head hurt. I would pay a lot of money to not have to listen to her.
The lecture was closed to the media, but a few enterprising souls discreetly recorded it. I haven’t read the full transcript, but here’s a small sample of Palinese to make you nostalgic for the 2008 campaign trail:
"Personally, I’ve always been really interested in the ideas too about the land bridge. Ideas that maybe so long ago, had allowed Alaska to be physically connected to this part of our world so many years ago. My husband and my children, they’re part [unintelligible] Eskimo, Alaskan natives. They’re our first people, and the connection that may have brought ancestors from here to there is fascinating to me. Making our world seem a little bit smaller, more united, to consider that connection that allowed sharing of peoples and bloodlines and wildlife and flora and fauna, that connection to me is quite fascinating."
Ah, there’s the lyrical nonsense I’ve been missing. If only we could get William Shatner to do a spoken word performance of this one too.
The best part of the speech however was where she, with laserlike precision, identified the site and source of last year’s economic collapse: "I'm going to call it like I see it and I will share with you candidly a view right from Main Street, Main Street U.S.A. ... We got into this mess because of government interference in the first place. We're not interested in government fixes, we're interested in freedom.” Ah, I see. The meltdown happened because of overregulation.
Yes. Yes, this is truly the person Americans want representing them to their biggest creditor. Wise move. She should get together with the flamer guy. They would have a lot to talk about.
3. Put ‘im on Signal Hill, I says!
For a little while there, it looked like Libyan leader Muammar al-Qaddafi was going to be stopping over in St. John’s for a night as his plane refuelled. Alas, it is no longer to be—he has cancelled all the reservations for his one-hundred-plus retinue. Why, I couldn’t say. It might have something to do with not wanting to be chastised by our foreign affairs minister, who was going to fly to St. John’s specifically to upbraid him for giving a hero’s welcome to the Lockerbie bomber.
My theory is that there was a problem with the tent. One of the interesting factoids that has surfaced this week while Qaddafi was in New York is that when travelling he brings his own Bedouin tent to sleep in. He had difficulty finding somewhere to pitch it in New York—apparently no one wanted him in their back yard—and there was discussion of where he would set up camp in St. John’s. As a friend of mine observed, “A tent? In Newfoundland in autumn? It’s like a ready-made CODCO skit."
I don’t think the cancellation was because of the planned rebuke. I think he just found out what the Newfoundland climate is.
The worst part of this saga is that I now have a mental image of Qaddafi pitching a tent, and that’s just not something I want in my head.
4. Darwin=Hitler. Who knew?
And lastly, everyone’s favourite child-actor-turned-evangelical Kirk Cameron is, with the assistance of his friend Ray Comfort (known as the Banana Man), publishing Darwin’s Origin of Species and distributing it to students ... with a new fifty-page introduction, that is. This introduction outlines “the history of evolution, a timeline of Darwin’s life, Adolf Hitler’s undeniable connection with the theory, Darwin’s racism, his disdain for women, and Darwin’s thoughts on the existence of God.” Um, what? Hitler’s what with the what? Obviously, I need to get my hands on a copy of this. I have missed a crucial historical narrative apparently, in which Hitler travels back in time and forces Darwin to write Origin between his atheistic race-baiting and misogyny. So much I have still to learn.
Incidentally, the official Nazi line on evolution was that the Aryan race crashed fully formed on Earth frozen in a comet. But they were down with everyone else coming from apes. Which, when you think about it, totally confirms Kirk’s thesis.
That boy needs to learn about the logical fallacy called the “reductio ad Hitlerum.” But let him make his case himself:
I love the bit where he says, “All we want to do is present the opposing and correct view, without being censored ... which is exactly the case at present.” All I have to say to that is: Um, atheistic censor bureau? Can you please get your shit together? Obviously, you’re not doing a good enough job.
Thursday, September 24, 2009
Well, I suppose if I must, I must...
As long as I'm on the subject of all things Irish, I should mention in passing that last week U2 played in Toronto and I WASN'T THERE. Kristen was, as well as a significant number of other friends of mine, but there was no way I was getting away in the first week of classes, alas.
I'm rather happy that U2 is touring while my parents are in Ireland, because my parents are exactly the sort of people who would wander into a small pub in Dublin and befriend the four guys having a quiet pint, and later tell me "They were all very nice, even the guy in the sunglasses. He wouldn't shut up about Africa. They said they were all in a band. Have you heard of them?"
Wednesday, September 23, 2009
Yup, she’s yet again taking aim at the contemporary Canadian academy, and applying the trademarked Wente rhetorical strategy—namely, cherry-picking one or two anecdotal incidents or observations and expounding from them to a broad generalization of outrage.
In this case, it’s the fact that professors, apparently, don’t teach. She cites the fact that today’s undergraduate experience tends to involve large, if not massive, classes, often “taught by itinerant graduate students” rather than professors. “Classes are held in giant amphitheatres,” she continues, “with multiple-choice tests instead of essay questions.” She goes on to observe that the dropout rate of undergraduates is at an “all-time high,” with 30 percent bailing after the first year and only 56 percent finishing after six years.
This much is undeniably, and unfortunately, true. I could go further and point to the fact that the balance of teaching, especially crucial introductory courses, is now done not by full-time professors (or even itinerant graduate students) but by part-time and contractual faculty who have few benefits, no job security, and often don’t know what or how much they’ll be teaching—and hence how they’ll support themselves financially—until mere weeks before classes start. The ratio of courses taught by sessional faculty to full-time is usually around two to one, sometimes three to one. This is a situation that has been worsening for many years now as budgets get cut and departments are increasingly told to do more with less, with predictable ripple effects among our student populations.
This has, indeed, become one of the Gordian knots of the academy both here and in the U.S., much puzzled and fretted over at all levels of the university. There are a host of reasons why we have arrived at this impasse, none of them reducible to a simple set of causes. However, never one to let complexity or nuance dissuade her from an outraged generalization, Ms. Wente sums up the site and source of the problems as follows:
“The universities say the problem is money. If only they had more of it, they could do a better job of educating undergraduates. There's just one catch. Educating undergraduates is just about the last thing most professors want to do.”
Huh. As it happens, I’m writing this on a break from preparing the three classes I’m teaching tomorrow, which all together total about one hundred and thirty students. Two of them are first year classes, one the standard first-year English that every student at MUN has to take, and the other an advanced composition course. I’m also teaching my usual twentieth-century U.S. fiction class, which this year is necessitating a lot more work because I decided to teach all novels I’ve never taught before, in an effort to keep the material fresh.
And however much I might complain about aspects of teaching, this is to my mind the best part of my job. I love teaching undergraduate courses. Now, of course, that’s just me—but honestly, I have met very few professors, either here at Memorial or during my time at Western, or in the larger peer circle of Canadian academia, who have not taken undergraduate teaching seriously and devoted much time and energy to providing their students with the best instruction they can. There are exceptions to this rule, of course, but they are just that—exceptions. And I take exception to Ms. Wente’s attempt to invert the ratio of this rule. To hear her speak, we all would rather bury ourselves in our research and ignore the undergraduate populations of our campuses entirely.
In true cherry-picking fashion, she supports her claims with a quotation from a U of Manitoba professor: “‘My colleagues do everything they can to get out of teaching,’ says Rod Clifton, who works in the faculty of education at the University of Manitoba. ‘They'd rather not have the students around, because they'd rather do research and stand around and sip sherry.’” Ah, the indolence of research, which apparently occurs during booze-soaked mixers in the faculty lounge. The fact that neither I nor my colleagues drink sherry notwithstanding, I’m not sure whether Dr. Clifton’s comment is meant to be facetious or intellectually dishonest; either way, it doesn’t bode well, in the absence of other evidence, for Ms. Wente’s argument.*
If professors’ reluctance to teach is the principal stumbling-block to a quality undergraduate education, its partner in crime is the research that preoccupies them and takes them away from the classroom: “Professors are rewarded not for turning out high-quality graduates, but for turning out books and papers – even if they are unread. This perverse system stubbornly persists, despite the fact that everyone knows it's absurd.” While generously granting that “some research” is useful, specifically in science and medicine, Ms. Wente effectively dismisses everything produced in the humanities. “Nobody,” she states, is “clamouring for another book on Moby-Dick.” I suppose this is true enough, as far as it goes; and it is hard to deny that the research requirements now levied on professors are far more onerous than back in the halcyon days when Ms. Wente was an undergrad, when “classes were small and many of our professors were creative and enthusiastic,” to the point where “some of them were happy to hang around with us drinking coffee, smoking dope and arguing about Blake and life.”**
What this dismissal of research in the humanities misses is something crucial to the nature of the university itself. If teaching the Great Books and chewing the fat about Blake and Life, the Universe, and Everything were all that were involved, it hardly seems necessary to demand of professors the accreditation of a PhD. What research is largely about, whether the articles and books produced are read or not, is the demonstration of an engaged and enthusiastic mind that didn’t freeze at the moment of the thesis’ defense. Part of the philosophy behind the university as a whole is that individual professors’ research makes them better and more relevant teachers of both undergraduate and graduate students.
Incidentally, this point was made today much more eloquently in the Globe and Mail by Clifford Orwin, a political science professor at U of T. He writes, “my teaching depends on that research. To teach is to communicate enthusiasm for learning, and what sustains that enthusiasm is continuing to learn yourself. It's also to set an example of progress to nourish in your students the hope that they too can contribute to progress. No, not all research done at universities is valuable. The surprise is how much of it is.”
I’ll end this post by observing that I’ve come to the conclusion that every time Margaret Wente is at a loss for something to whinge about, she pens an anti-university column. It seems to happen two or three times a year, and I’d really like her to make up her mind. Are we ivory-tower mandarins locked into cultural irrelevance? Or are we providers of pop-culture dreck who siphon off unwarranted federal research funds in our ongoing quest to bury the Great Books under layers of obfuscatory “theory”? Are we cheating our students of the great lessons of civilization by denying them those Great Books? Or are we cheating our students’ futures by not steering them into math and science and away from the irrelevant humanities? All these are variations on themes I have read in her columns, and taken together it becomes something of an inchoate but intense dislike that, I think, makes a little more sense if you read the opening paragraph of yesterday’s column. I’ve already quoted part of it, but here it is in its entirety:
“I went to university back in the golden age. Our classes were small and many of our professors were creative and enthusiastic. They even marked our papers themselves. There was lots of scope for what is now known as ‘engagement,’ which means that although we were undergraduates, some of them were happy to hang around with us drinking coffee, smoking dope and arguing about Blake and life.”
Nostalgia is a treacherous thing, for it distorts not only our memory of the past but our perception of the present. It makes me wonder if Ms. Wente so dislikes what she sees in the present academy because she resents that contemporary students don’t get this sort of experience … or because they do, but in her mind it could never rival that “golden age” (itself a deeply problematic concept that I would challenge her on, were she my student). What she describes in this passage is not at all a relic, but still something that many, many undergraduate students experience today (pot-smoking professors and all). Perhaps instead of Blake, they’re talking about Foucault, or Kathy Acker, or Quentin Tarantino. Or maybe even Blake, why not? Whatever she may believe, her idyllic university experience has not yet passed from this earth.
*Incidentally, were she to submit this column as an essay in my advanced composition class, I would grant it a C+ largely on the strength of being reasonably well written and possessing a clearly stated thesis; this recourse to a single piece of anecdotal evidence however fails utterly to make the connection to the statistics cited earlier in the piece or to persuasively account for them. I would call this a fallacy of insufficient inductive reasoning.
**Given the general capriciousness of Margaret Wente’s antipathy to the academy, I would lay bets that if there were a sudden rash of socializing dope-smoking profs hanging out in coffee shops with impressionable students, we’d be seeing a column on (a) inappropriate professorial behaviour, (b) evidence that professors are lazy and not earning their salaries, or (c) “THIS is where students’ tuition dollars are getting them?” Take your pick.
Friday, September 18, 2009
My question is this: Is it possible for Glenn Beck to be MORE of a jackass?
Seriously, he has redefined conservative demagoguery ... and by redefined, I mean made it so inchoate, self-contradictory and, well, batshit insane, that it would function as a parody of the whole genre if it weren’t taken seriously by a scary number of people.
But it was the word “parody” that set off the light bulb, and gave me the blinding epiphany that finally answered the haunting question.
I suddenly realized: Glenn Beck isn’t real. He’s a character being played by Sacha Baron Cohen.
Wow. I mean, wow ... nicely played, sir. First Ali G, then Borat, then Brüno. But this one takes the prize, it really does. Kudos.
But, um, don’t you think you’ve taken this one a little bit too far by now?
Thursday, September 17, 2009
None of the political wisdom and scholarship I have encountered in my life has managed to sum up so neatly the constant frustration with the venality, posturing, graft, waste, and partisan hackery of liberal democracy and the inescapable fact that the vast majority of us wouldn’t trade it for anything. Is democracy messy? Yes. Inefficient? Yes. These two elements alone make frustration with government and contempt for politicians the third and fourth constants in life after death and taxes. Which is why I’m generally willing to cut politicians a little slack for their failings, if for no other reason than those very failings are more or less inevitable on some level, unavoidable by-products of a political system based on horse-trading and compromise. Max Weber wasn’t exaggerating when he observed that one attempts to effect political solutions to social problems at nothing less than the cost of one’s own soul.
All this is apropos of our current shifting political landscape, or what we more commonly call a minority government. Rick Salutin had a column in the Globe last weekend in which he gently chided those complaining about the possibility of a fall election: “In a vital democracy, like ancient Athens or the Iroquois confederacy, people were involved in politics continually. Under our system, politics more or less equals elections, so you could call frequent elections our form of participatory democracy. It keeps citizens engaged and parties on their toes.” He then asks whether we’d have seen even the “minimal action” Harper has made on the economy or Afghanistan if the Tories had a majority.
Would I be happier with the Conservatives out of power? Absolutely. But a Harper minority is in a variety of ways preferable to a Liberal supermajority, for the simple reason that it tends to prevent complacency and the insufferable arrogance and tone-deafness that finally brought down the Liberals and would be worse (I believe) by a magnitude under a Harper majority. To keep his party in power, Harper has been forced to make all sorts of progressive concessions that would be unthinkable with a majority, and in response to Ignatieff’s sabre-rattling he is floating concessions on EI spending in an attempt to garner NDP and Bloc support.
I love this for two reasons, one petty and one idealistic. The petty reason is that it puts egg on the face of Harper, who around this time last year was condemning the Liberals’ willingness to work with “socialists and separatists.” My, how the wheel turns.
The other reason is that I believe this is how it’s supposed to work. The whole point of having ideologically opposed political parties is to have them act as checks and balances on the other and mitigate their excesses. Would I prefer to have Prime Minister Ignatieff? Probably. But a Stephen Harper forced to reach to Iggy’s left will do nicely for the time being, thank you very much.
Tuesday, September 15, 2009
Jonathan Chait has an excellent article in The New Republic discussing two new biographies of Ayn Rand. He offers some sharp analyses of Rand’s particular brand of philosophy, but also puts his finger on the biggest flaw in her celebration of individual accomplishment and excellence: namely, her assumption that income is the purest measure of success, and that “elite” in the Rand vocabulary is invariably synonymous with “business elite.” It is as if, quips Matthew Yglesias, “an Albert Einstein is just a kind of middleweight hack but the VP for Marketing at Federal Express is one of [the] ubermenschen.”
I had not, I must admit, read much Rand until recently. During my undergrad I read one of her lesser novels (Anthem) and imbibed enough of her philosophy to be entirely turned off; during the writing of my dissertation I had the entertaining experience of reading her Screen Guide for Americans, a guidebook she wrote for the Motion Picture Alliance that offered advice on how to detect, identify, and avoid communist influence in films (truly, a page turner, especially if you don’t quite understand how Frank Capra and Henry Fonda were raging Reds). I have lately been working (slogging) my way through Atlas Shrugged at the behest of a student who, as a Rand enthusiast, quite rightly suggested that if I wanted to mock Rand I should put my money where my mouth is and read what is considered her masterpiece.
So far? Unimpressed. I can see where the narrative is going, and while the celebration of personal accomplishment is always commendable, Rand’s philosophy is relentlessly bloodless and isolating. Putting aside for the moment the fallacy of making an absolute connection between income and excellence, Rand’s particular brand of individualism is so relentlessly militant it is anathema to any form or incarnation of community. Anything that impinges upon individual accomplishment—any obligation that individual has to other people, ethically or otherwise—Rand rejects as the leading edge of collective mediocrity eroding the heroic individual.
I suppose it’s because my own conception of “great accomplishment” tends to lean more toward the civic or political, or toward triumphs of the imagination, creative or empirical—in general, accomplishment that presupposes the value of community and the social contract, and the need to contribute to them—that I find Rand’s John Galts and Howard Roarks compelling but ultimately hollow. Shakespeare wasn’t a millionaire and Mozart died a pauper, after all, and as far as “accomplishment” goes, I’d put them ahead of Nelson Rockefeller any day.
Friday, September 11, 2009
Monday, September 07, 2009
Or, well, I should be starting to sweat a little, but the day is making it difficult. It is bright and sunny and cloudless, but there is a little bite in the air that speaks of autumn. This is, and has been for as long as I can remember, my favourite time of year. I love the sense of renewal and hope the new school year tends to bring, like untouched snow or a blank piece of paper. I’ve always felt like this in September, even during the dark years of teenage angst when school was a trial and a burden. For a few weeks I could imagine it would be otherwise, and then one day—in my last year of high school—it was, and university proved an even better and more rewarding experience.
In the New York Times today there is a collection of advice for new university and college students written by such academic luminaries as Stanley Fish, Harold Bloom and Martha Nussbaum. Should any students happen across this blog looking for info about their English prof this fall, I highly recommend it—as should anyone else who stumbles into my humble little spot on the InterTubes. There’s some good advice there, especially that of Gerald Graff, Gary Wills and Nussbaum. Harold Bloom’s advice to discover the Great Books is near to my heart, but try not to be turned off by his high-handedness.
Also, Carol Berkin’s bit about how not to alienate your professor is a bit of wisdom to write down on the first page of all your notebooks. Seriously. Write it down and commit it to memory.
Seeing as how the Times was remiss in not asking me for my own wisdom on starting your university career (obviously, some wires were crossed somewhere), I’ll share here what I used to tell students about to start at the University of Western Ontario, my alma mater. When I was in the final stages of my doctorate and teaching as a sessional professor, I also worked as an advisor to students enrolled in the Media, Information and Technoculture degree. Before talking to them one on one, I addressed them as a group and said (more or less) the following:
One of my favourite professors during my undergraduate degree was fond of saying that university was your last opportunity to fail magnificently. What he meant by this was that these coming years are your time to try out new things—new ideas, new ways of thinking, new modes of yourself—and to put who you are to the test. You should not be afraid of failure, because we learn more from failing magnificently than succeeding blandly. Many go into their degree knowing exactly what they want to do and be, and find themselves suddenly discovering that they were wrong—that that degree in physics, or business, or pre-law, or political science, or yes, even English, was not for them. At all.
This of course doesn’t happen to everyone. Many, if not most, cheerfully soldier on through with little existential angst. To a certain extent, that is unfortunate—a little existential angst in your early twenties is a rite of passage, and good for the soul. When and if it happens, remember the old adage that no crisis should ever be wasted, and strike out in different directions (just as Martha Nussbaum suggests in the Times).
I am somewhat more hesitant to encourage my students to fail magnificently than was my own professor, not least because in the face of ever-rising tuition and a tenuous world economy, such advice is costly at best and frivolous at worst. But its spirit is always sound, for it speaks to that greatest of human virtues: curiosity. If I have any one piece of advice, it is this: be curious.
(It is serendipitous that my last Weekly Wisdom quotation was on this very subject).
Speaking as an educator, there is nothing more soul-destroying than apathetic, indifferent and incurious students. University is a time of discovery and exploration, whatever your degree. It is an inescapable fact that the student who muddles through with a C average, having done the bare minimum and cared for nothing more than is necessary to pass, gets the same degree as the student who works hard and finishes with an A. There are those who hold up this fact as evidence of a bachelor’s degree’s fatuity, but they are the same people who mistake a degree for the piece of paper they get at the end.
University is process, not product. It is transformational: I see this every year when the light bulb goes off for students and some idea or philosophy or author or school of thought changes everything. It is also not for everyone—some people find comparable transformative experiences travelling or working or going to college. Being open to those possibilities however, to any possibilities, is the start of wisdom.
So be curious. Ask questions. And I’ll see you in class on Thursday.
Wednesday, August 26, 2009
Tuesday, August 25, 2009
Ever have it happen that two of your very favourite things combine? Like chocolate and peanut butter, only better?
I am an avid but selective reader of fantasy novels—avid because I’ve loved the genre since discovering Tolkien and C.S. Lewis at the age of eleven, and selective because so much of the genre is cliché, borderline misogynist, and generally really badly written. There are however some authors who raise the genre back up to Tolkien-esque heights, whose novels are innovative and well-written. Canada’s own Guy Gavriel Kay is one of my favourites; Robin Hobb’s recent “Soldier Son” trilogy was a splendid allegory of imperialism and indigeneity; readers of this blog know me as a Harry Potter fan; Philip Pullman’s His Dark Materials is mind-blowing; and of course I have become a recent devotee of Sir Terry Pratchett. (I also play World of Warcraft on occasion, though right now my enthusiasm for the game is at low ebb—it goes in three to four month cycles).
There is however one fantasy series at the apex of the genre right now, and that is George R. R. Martin’s A Song of Ice and Fire. I started reading it in 1996, when the first installment—A Game of Thrones—was out in hardcover. It was an accidental acquisition: my brother Matt worked then at Chapters doing IT, and as part of their bonus they were taken into the overstock room and allowed to grab an armload of books. Not knowing much about my reading habits, but knowing I liked fantasy, Matt grabbed it.
That was thirteen years ago, and if I have any complaint about the series it’s that Martin is exceptionally slow in writing the novels. The series is now projected to run to seven novels, and in those thirteen years he has produced four: A Game of Thrones (1996), A Clash of Kings (1998), A Storm of Swords (2000), and A Feast for Crows (2005). Considering the heft of each installment, two years between each of the first three is reasonable; but then we have to wait five years for number four, and we’re still waiting on number five … with two more projected after that. Never have I been so deeply invested in an author’s continuing good health.
HOWEVER … for some time now, there has been speculation about the saga being turned into a television series, with each novel comprising one season. Now there is confirmation that this is happening, with season one starting next year.
That’s exciting in and of itself. What puts me over the moon is that the series is being produced by HBO!! H, B, fuckin’ O, as Deadwood’s Al Swearengen might say … so not only is my favourite ongoing fantasy series making it to TV, it’s going to be GOOD.
Long-time readers of my blog know my love for all (or, well, most) things HBO, and that I have picked up a little side-scholarship writing articles on such key series as Rome, Deadwood, The Wire and, most recently, Oz. I look at HBO as the flagship of the smart TV fleet, consistently producing brilliant cinema-quality series that make television watching almost a literary endeavour. My favourite fantasy series on my favourite cable network?? Too … much … goodness …
Breathe, Lockett … breathe …
What doubles down today on my geeking out is that Martin announced on his blog the other day some key casting for the principal roles. Some are actors I don’t know, or know tangentially, but they all look perfect.
What follows will make little sense if you haven’t read A Song of Ice and Fire. Just warning you.
Tripling down on my geeking out is who they’ve cast in the central role of Ned Stark:
Yes, that’s right … Sean. Fucking. Bean. Boromir himself. Richard Sharpe. And now Ned Stark. It’s like someone gave him a reading list of my favourite things and he’s made it his life’s work to act in the film adaptations of as many as possible.
Next, Ned’s long-suffering wife Catelyn Stark: Jennifer Ehle.
For those who find her vaguely familiar but can’t place her, here’s a picture from her best-known role:
Robb Stark: Richard Madden.
Sansa Stark: Sophie Turner.
King Robert Baratheon: Mark Addy.
Now, this is the one I’m not sold on. I like Mark Addy, but I don’t know if he has the presence to play King Robert. Robert is described as a massive man gone to fat, oversized in all his appetites and, ultimately, something of a blowhard and an asshole. But with a heart.
Jaime Lannister: Nikolaj Coster-Waldau.
I don’t know this actor, but wow … does he ever look the part.
Tyrion Lannister: Peter Dinklage.
Of course, Peter Dinklage. It’s a bit of a shame that an actor as talented as he is should of necessity be limited to the dwarf role in whatever he does … but he’ll be amazing as Tyrion. Tyrion is such a great character, too—Dinklage should really be able to get his teeth into this role.
Daenerys Targaryen: Tamzin Merchant.
Again, not someone I’m very familiar with—she plays Katherine Howard on The Tudors, but she really looks the part.
Theon Greyjoy: Alfie Allen.
And lastly, Ser Jorah Mormont: Iain Glen.
Excited yet, fellow Ice & Fire fans? Further updates as events require.
Monday, August 24, 2009
I’ve had a nasty cough dogging me the past few days, the kind that’s powerful enough to give you the same sort of headache you get from shaking your head back and forth vigorously. So I went out and bought some cough syrup, and yesterday, as I sat at my office computer, I saw it sitting there and the following bit of West Wing dialogue dropped into my head.
SPEC. AGENT CASPER: Cough medicine with tractor starter fluid strained through a coffee filter is methamphetamine.
PRESIDENT BARTLET: Tractor starter fluid doesn't kill you?
CASPER: No, it'll definitely kill you, but first you'll get pretty high.
This led me in turn to the episode of Friday Night Lights (yes, I watch a lot of teevee) in which Tim Riggins realizes that his sketchy roommate is actually a meth dealer, and that the cough medication he’s been getting Riggins to buy for him is one of the key ingredients.
And then from there to the idle thought as I looked at my bottle of Benylin, “Well, if only I had some tractor starter fluid …”
The white-water rapids that are the stream of my consciousness then turned it into a philosophical and ethical question between personal morality and societal prohibition. On the one hand, I thought, I would have no specifically moral qualms about growing and selling small amounts of marijuana to friends and acquaintances, as I firmly believe that weed is a relatively harmless social drug—much less so, for example, than alcohol. The reason I don’t do so (besides laziness and an inveterate brown thumb) is that I would fear arrest or other legal punitive measures. So my hypothetical career as a small-time pot dealer founders on the reef of the law.
Conversely, even if I did have some tractor starter fluid on hand, I would never consider making or selling a drug like crystal meth, for the simple reason that it is a deeply harmful drug and I find its dissemination morally repugnant. There is no need for the social prohibition in this case, as my personal morality preempts even the thought.
This little unbidden thought experiment (which unfolded in all of about thirty seconds as I paused with my hands above my laptop keyboard) then opened up in my mind the various issues at stake in the current state of laws surrounding marijuana’s legality and lack thereof. I personally see the incremental decriminalization of pot in this country as a progressive and positive thing—the stigma attached to marijuana for the better part of the twentieth century, largely an American import, is the height of irrationality. We now fortunately have a much better sense of how it differs qualitatively from narcotics, and the benefits it offers in some medical contexts.
The problem with marijuana from an ethical and moral perspective is not its effects on users, but where it comes from. One of the upshots of its deeply entrenched illegality for the last half-century is that it offers big money to organized crime, and large-scale grow ops have become a suburban plague.
I never fully appreciated the pervasiveness of this until recently when my brother and sister-in-law went on the market for a new house. One of the first ones they saw, an early favourite, seemed priced oddly low for its size and location. It had been totally renovated, and the pictures of the renovation provided to prospective buyers showed that the walls had been taken down to the studs and replaced, as had been the ceilings and floors. It was then that the penny dropped for my father, who realized that the house had been a grow operation.
Weirdly enough, one of the parties inquiring about my brother's house asked a series of bizarre questions that only made sense once you realized they were themselves looking to buy a house for a grow op.
The damage done to houses by large-scale marijuana cultivation is catastrophic. This is from a website for an indoor environmental testing company in Ontario:
“These homes or industrial units are operated at a minimum of 27 degrees celsius with a sustained relative humidity of 80% or higher. The end result is an excessive amount of mould growth often hidden inside wall or ceiling systems. Extreme humid conditions cause extensive mould growth throughout these buildings. Mould growth resulting from these conditions are considered extremely hazardous due to their toxigenic nature. Species of mould growth found in these buildings in most cases posses mycotoxins which can be extremely hazardous and life threatening for anyone who enters these structures. In many cases mould growth is growing inside wall cavities out of sight without any indication of a problem or any visible signs of moisture damage.”
What happens is this: a house is purchased with a down payment of anywhere between $20K and $50K, depending on the value of the house, through shell companies, which then pay the mortgage for six months to a year. Considering that the yield at the end of that period ranges from hundreds of thousands to over a million dollars, the investment is minimal. After several harvests, the buyers simply walk away, leaving behind a ruined house that will never, however much renovation is done, reclaim its pre-grow-op value, because the damage can be so pervasive.
There is apparently a pot drought in Newfoundland right now. The police seized a massive shipment coming in, and as a result pot-smokers everywhere here are suffering weed privation. Upon being told of this, I joked that I should scatter some seeds in the unruly rear section of my backyard, which I have not trimmed back this summer, and cultivate small amounts of marijuana to sell to friends and acquaintances and supplement my income. Of course, I am not about to do this (and the way you know I’m not is because I’m musing about it on a blog—so if the constabulary happens to read this, I ask you to keep that in mind. Though if you do want to come and inspect my backyard for cannabis, feel free. Just watch out for the spiders).
On the other hand however, this makes me wonder if one way to reconcile pot decriminalization with the problem of organized crime’s predations would be with—pardon the pun—just this sort of grass roots approach. Legalization advocates point out that the regulation and taxation of marijuana as a cash crop could be a financial boon to government; I find it hard to imagine that we won’t arrive at that point eventually, but it will likely take a long time, and in the interim organized crime will continue to destroy houses (to say nothing of the concomitant violence accompanying large-scale drug trafficking). In the meantime, what about making small personal gardens of maybe a half dozen plants legal? The hypothetical scenario in which I cultivate a few plants and provide for a small circle of people would have the minor but very real effect of taking that business away from the big grow ops. Spread that out on a broader scale of people planting cannabis alongside their hydrangeas and heirloom tomatoes, and that effect could be profound.
Incidentally, since drafting this post I watched the episode of The West Wing mentioning the ingredients of meth, and as it happens I got it wrong: Special Agent Casper says allergy medication, not cough medicine, is what is used. That this entire sequence of thought began with me misremembering a key detail seems eminently appropriate.
Friday, August 21, 2009
It's time to circle the wagons, use the majority, and beat the Blue Dogs with a hose until they toe the line. Where's Rahm "attack dog" Emanuel in all this?
I'm now linking to Hendrik Hertzberg's blog at The New Yorker, because I'm reasonably certain that everything he says is right.
A colleague of mine sent around this article from The Chronicle of Higher Education this morning, and I'm of two minds on it. It makes some excellent points about what in humanities academia has come to go by the rubric "professionalization" -- in short, a premium put on research and publishing, with an emphasis on volume and a hierarchy of value that places peer-reviewed journals and academic presses at the top. What has resulted, the article argues, is a dizzying increase in scholarly publishing with a concomitant reduction in the reading audience for that scholarship.
On one hand, I tend to agree that the elevation of peer-reviewed writing to the level of the sacrosanct tends to make literary scholarship increasingly specialized and obfuscatory. I further tend to think that professors have a responsibility to the larger community as public intellectuals, but that kind of writing and publishing doesn't happen when one is under the gun to get as many articles in peer-reviewed journals published as possible come tenure review time.
On the other hand, the article makes a few false distinctions. The question of "audience" for peer-reviewed work is a bit of a red herring, because audience size is rather beside the point. Speaking personally, my own motivation for research and writing and (ultimately, hopefully) publishing -- besides the carrot/stick of tenure and promotion -- is my own interest in my subject matter, and the desire to keep sharp for the sake of my students.
There's also a false binary in the article between "interpretive" (which seems to be synonymous with New Criticism) and "performative" (which I take to mean "theoretical") reading. What about historicist scholarship? Or more sociological approaches? Or the simple fact that literature's relevance changes depending on the realities of our given historical moment? That is, after all, why we keep returning to Hamlet.
Thursday, August 20, 2009
Wednesday, August 19, 2009
What prompts this post is not those texts per se, but rather the fact that in responding to them and formulating my critique, I found myself offering a fairly lengthy religious autobiography—largely because it is a subject that tends to make one want to lay one’s cards on the table and share one’s experiences and beliefs, past and present. Why this is the case with religious discussion I am not certain—perhaps because matters of faith tend to be intensely personal, and that inflects any broader discussion of religion in culture. It’s hard to say.
The long and short of it is, after writing out an account of my own engagements and struggles with matters of religion, I read it over and wondered if this was something I really wanted to post to my blog. On the one hand, I don’t mind sharing, and it feels somewhat cathartic to write it all through. Should it spark an animated discussion, so much the better. On the other hand, part of me is slightly appalled that I am so blasé about putting what is really a deeply personal narrative out into the public sphere. Is it narcissistic to do so? Leaving aside for the moment the fact that I don’t exactly have a wide readership for this blog, and that most people who would actually read through what I have to say (or who have read this far in this post) already know me pretty well, I do wonder at my compulsion at times to post elements of my life into a forum anyone can read.
With these thoughts in mind, I’ve gone back and read old posts and realized that while tempted during periods of blogging inactivity to discontinue this blog, I have never done so because I have grown very attached to this. It has become something of an autobiography or memoir, erratic and fractured in nature—the posts don’t unfold a narrative of my life so much as a grab-bag of thoughts, meditations, screeds, and updates—but ultimately something I am glad to have started, and something that has proved a more reliable site and source for my personal history than any private journal I’ve ever started (indeed, my journal more often than not these days contains notes toward blog posts).
And yes, I suppose it is a bit narcissistic to write about oneself in so public a manner. But then, I became reconciled to my narcissism some time ago, and in so doing hope to regulate it somewhat. It is, to be fair, a bit of an occupational hazard.
Posts on the “god debate” to come. Account of personal religious odyssey possibly included.
Saturday, August 15, 2009
Thursday, August 13, 2009
I say that with my tongue firmly in my cheek of course, but the ongoing saga of anti-reform mobs—and yes, “mobs” is the proper term here—pre-empting discussion and dialogue at the behest of lobbyists in the employ of the health insurance industry, to say nothing of the egregious disinformation being disseminated, represents a depressing nadir in U.S. political discourse.
But I’m off topic. It was the issue of guns that sat me down to post this morning. More specifically, it was the bizarre fact that all of the guns on display in the cases outlined by Ms. Collins were entirely legal.
I find it interesting that those most adamant defenders of the second amendment tend to share political ground with those who favour a “strict constructionist” interpretation of the constitution—namely, the belief that the constitution should not be re-interpreted to suit new historical contexts, but should hew as closely as possible to what the Founding Fathers intended when drafting it.
Now, speaking as an English professor, I think this entire approach is a bit wonky. Interpreting a text on the speculation of what was in its authors’ heads is an invitation to fallacy. On the other hand, I’m willing to grant that there is a lot of daylight between matters literary and legal. That being said, I’m not sure how a strict constructionist approach to the second amendment translates into “guns for everyone!” It reads, “A well regulated militia being necessary to the security of a free State, the right of the People to keep and bear arms shall not be infringed.”
Here, a little historical context is helpful. The Founding Fathers were deeply suspicious of professional armies such as the British redcoats, seeing them as the tools of tyranny. The original idea was that the U.S. should not have a professional army, but a people’s militia. To that end, it was the citizenry’s obligation to have weapons on hand to prevent (to quote The Simpsons) King George from coming into your living room to push you around.
Well and good. Of course, that particular prejudice against a professional military class didn’t last, and today the U.S. spends nearly as much on defense as the rest of the world combined.
See, if I were being a strict constructionist, I would parse the amendment as follows: the first clause regarding the “well regulated militia” (key word regulated) that is necessary for the “security of a free state” (i.e., national security resides in this militia) modifies the main clause, “the right of the People to keep and bear arms shall not be infringed.” Gun-rights advocates tend to decontextualize the main clause, leaving out the modifying clause—which ultimately stipulates that the right to bear arms is predicated on being a militia reservist.
The fact of the professional U.S. military sort of derails a strict constructionist reading of this amendment. I do however have a solution—two solutions, actually.
Seeing as how the second amendment implies the illegality of a professional military, it should be dissolved and everyone gets to stockpile as many guns as they want. Failing that option, in exchange for surrendering their firearms, every American will be issued one Brown Bess musket—which was undoubtedly what the Founders had in mind when they spoke of “arms”—to have on hand should King George ever feel like throwing his weight around again.
Wednesday, August 12, 2009
Monday, August 10, 2009
One of the upshots of this is that I find myself reading chapters out of novels I haven’t read. Sometimes I am unimpressed, but more often it is just enough to make me want to read the whole novel. I have, in the two times I have taught the course thus far, been led to some excellent reads, like Christopher Moore’s Lamb and David Adams Richards’s Mercy Among the Children. But perhaps the biggest gold mine—or addiction—has been the Discworld novels of Terry Pratchett. This past winter, two students gave me chapters of his work—one from Small Gods, and the other from Wyrd Sisters—and I have since read both novels and another ten on top of that.
I’ve been familiar with or aware of Terry Pratchett for some time now (it’s hard not to be, if for no other reason than he’s penned about thirty novels since the early 80s and has practically a shelf all to himself in the fantasy section), having read Good Omens, a novel he co-authored with Neil Gaiman. I’d kept Pratchett at arm’s length since then despite loving Good Omens, or rather because I loved Good Omens—with that many novels on the shelves, it could easily devolve into a literary addiction, and lord knows I’ve had enough of those. But then came this past winter and my sampling of two chapters of his work, and the rest is history.
Perhaps this isn’t technically “summer” reading, given that this Pratchett spree started in February, but I’ve read enough of them since May to qualify.
So what is so appealing about these novels? To the uninitiated, the “Discworld,” where these stories take place, is ... well, let Pratchett describe it. From Wyrd Sisters:
“Through the fathomless deeps of space swims the star turtle Great A’Tuin, bearing on its back the four giant elephants who carry on their shoulders the mass of the Discworld. A tiny sun and moon spin about them, on a complicated orbit to induce seasons, so probably nowhere else in the multiverse is it sometimes necessary for an elephant to cock a leg to allow the sun to go past.”
The Discworld, the novels suggest, is a whimsical invention of a Creator who “got bored with all the usual business of axial inclination, albedos and rotational velocities, and decided to have a bit of fun for once.” What this imaginative coup allows is for Pratchett to people his world with everything and everyone, from humans to dwarfs, trolls, imps, vampires, werewolves, golems, wizards, witches, as well as a host of different human cultures and races that resemble those of our own world—and it is in the meeting and clash of these many groups that a great number of the novels find their dramatic tension.
While the novels range all over this capacious world, many of them center on the sprawling city of Ankh-Morpork (a city typical of Pratchett’s playfulness with naming both places and people). Ankh-Morpork is a primary point of contact for these multitudinous peoples and races, and is as such rather deliberately evocative of post-colonial London. It is also emblematic of what I find most appealing about Pratchett’s novels, besides his brilliant sense of humour—all his novels (or those I have read so far) tend to embody a sort of low-key philosophy of pragmatism. The whimsically fantastical Discworld and its often benighted or absurd denizens belie an author with a keen eye for human foibles and a sense of irony as a strong form of natural law.* Pratchett, one intuits, is an anti-absolutist, someone for whom dogma of any species is anathema to a balanced and equitable society. One should not take the presence of gods and such timeless characters as Death (who makes at least a cameo in all the novels) as evidence of a belief in the transcendent, either: the gods inhabiting the Discworld resemble nothing so much as the Greek pantheon of competitive, jealous and petty deities. And in Pratchett’s world, the power and stature of gods is entirely determined by the amount of belief people have in them. The gods, in other words, are dependent on their worshippers, and not vice versa.
Like much great fantasy, Pratchett’s characters and situations have much to say about who we are, especially in the fetish of small differences that tends to excite prejudice and bigotry. That those “small differences” seem exaggerated by the contrast between the various species inhabiting the Discworld is often little more than comic misdirection. This is not to say that the novels offer a simplistic or cloying moral that underneath we’re really all the same—far from it. Often the resolution of differences is little more than the adaptation to living with our own pettiness. One of the characters who most embodies Pratchett’s pragmatism is the effective dictator of Ankh-Morpork, the shrewd Lord Vetinari. Vetinari is sort of the ultimate benevolent dictator: everything he does is for the better of his city, and his entire strategy is an elaborate balancing act in which he maintains the common weal not by ruling through fiat but by putting individuals into situations in which they act (a) in their own best interests, (b) according to their conscience, (c) out of a sense of honour or bravery, (d) out of cowardice, (e) out of avarice or personal gain, but in so doing carry out Vetinari’s wishes while imagining it was their own idea.
I tend to read Vetinari as Pratchett’s response to the old adage that the benevolent dictator is the ideal form of government (ruling in the common interest, but with the dictatorial power to get things done with dispatch). Vetinari is both a vindication of this claim and an expression of its vacuity: he carefully avoids the dictatorial power to “get things done,” knowing how that would disrupt the delicate political balance; and he is such a singular figure that one cannot help but wonder, when reading his storylines, what will happen to Ankh-Morpork when he is gone.
One of the problems with writing about Pratchett is how prolific he has been: I’ve read twelve of his novels, and that isn’t even the halfway mark (he’s up somewhere over thirty). The books abound with so many little comic quirks and asides, so many ingenious moments of imaginative invention, and what is really (to a new reader) an ongoing internal logic to the Discworld that becomes more comprehensive with each read, that it is difficult to offer specifics. This much I’ll say: if you like that uniquely British sense of humour such as one finds in Douglas Adams; if you are a fantasy fan; if you like crime fiction (because, interestingly, so many of these novels end up being narratives of getting to the bottom of something); and if you like novels that are almost invariably non-formulaic and surprising—I think you’ll find yourself quite as addicted as me.
So be warned.
*I must give credit for the phrase “irony as a strong form of natural law” to my friend Gregg Taylor; he used it in an episode of his online radio drama Black Jack Justice (episode 20, “Sabien’s Law”). While I would normally be loath to steal phraseology, this one was simply too apt for Pratchett’s sensibilities. Considering that the phrase is used to describe a certain police lieutenant and that one of Pratchett’s most endearing recurring characters is the hard-bitten and pragmatic watchman Samuel Vimes, this should not perhaps be a surprising coincidence.
Sunday, August 09, 2009
This comment has been rattling around with me lately, because I go through periods when I get a surfeit of story ideas. These get jotted down in a notebook, perhaps with some plot points fleshed out or even some exploratory prose, but rarely anything beyond that. Story ideas are like tunes stuck in my head—they drive me crazy for a week or so, during which time some writing happens, but then they tend to be replaced by other things and I lose interest. This might not be an issue if I had ideas for short stories, for which I could conceivably bang out a draft, but alas I seem to think on a novelistic scale.
This is not to say I don’t return to these ideas and flesh them out a bit further at times, or that I don’t one day hope to actually make it all the way to the end of one of them and get something into print. But whatever people might think, writing fiction is really hard work, and it’s even harder to do it well (a lack of talent on my part may well also be a contributing factor—time will tell, perhaps). With the bulk of my time devoted to my, you know, day job, writing creatively becomes more of an occasional hobby for my own amusement.
Which is why Borges’ notion of the imaginary book review appeals to me. I’ve been toying with the idea of starting a second blog to outline and “review” these novels that bang around in my skull. Considering the number of ideas I’ve had, contrasted with the productive time I could spend on them, all to the power of my basic laziness, it’s pretty much a mathematical impossibility that I could write them all in my lifetime—I’d have to be Philip K. Dick or Philip Roth. And, well, Dick was crazy, and Roth strikes me as a bit of a dick. To say nothing of the fact that as abstract ideas or a couple of pages of point form notes, it’s hard to say whether they’d even be viable ideas.
At the same time, I do like many of my ideas, and wouldn’t mind sharing them.
Of course, there’s also the times when I read or watch something and see that I’ve been scooped. Ever since the first time I watched Independence Day, I’ve wanted to write something that would be a corrective to that film’s hokey and triumphalist utopian ending. The implied future the film leaves us with is one in which, having come together under American leadership to defeat the aliens, the human race looks forward to having all its rifts and conflicts resolved.
I imagined a dystopian novel set some eighty years after the victory (over an entirely different alien enemy, of course—I wouldn’t want Roland Emmerich demanding a percentage), in which humanity, much reduced in numbers by the alien attacks, resides in a series of fractured and contested geographical alliances. The feel-good honeymoon after the victory soon eroded into a race by vestigial nation-states to plunder the alien technology and thus gain advantage, militarily and otherwise. The asymmetry of technology has resulted in several extremely powerful groups, which exact tribute from neighbouring impoverished demographies for protection. Meanwhile, what aliens survived have been imprisoned and forced to educate humans about their technology.
I’d always liked this idea. And then the other day, I saw a trailer for the new Peter Jackson-produced film District 9. The premise seems to be—at least in part—that an alien species has arrived on Earth and has been effectively imprisoned. While the film’s broader theme is obviously about racism and the mistreatment of refugees, the trailer makes it clear that a big reason for their imprisonment is to force them to give up their technological secrets.
So, really quite different on the whole from my idea. But I still feel scooped.