This past weekend I bought Joseph Anton, Salman Rushdie’s memoir of the years he spent in hiding under threat of Khomeini’s fatwa. In spite of how busy I am right now, it has proved a difficult book to put down, and I’m almost two hundred pages in (if I weren’t currently teaching three classes, I have little doubt I’d be finished already). It’s really just that good—enthralling, engaging, and harrowing. I will blog at greater length about it in the future. But for now, I just want to share a snippet I love.
Besides its narrative of hiding from extremists, there’s an awful lot of inside baseball about the literary scenes on both sides of the Atlantic, with all sorts of familiar names surfacing—Martin Amis, Ian McEwan, Angela Carter, Susan Sontag, Nadine Gordimer … the list goes on and on, to the point where a lesser author might be accused of name-dropping. But then, this is a book by one of the biggest names of all, so I suppose we shouldn’t get snarky. One of the lovely aspects of this book is the way Rushdie weaves it all together, and in the process tells the story of how some of the greatest novels of the late twentieth century (i.e. his) got written.
And how publishing works. When The Satanic Verses was put up for auction among publishing houses in the U.S. and Britain, he had a hard choice to make when it became clear that he stood to make more money than he’d ever imagined. His longtime editor at Jonathan Cape, Liz Calder, had recently left the company to start up Bloomsbury Publishing, and it was assumed that she would publish Rushdie’s new novel when he was done. Indeed, Calder and Rushdie’s British agent Deborah Rogers had made just such an arrangement. But when it became clear how much money was being offered as an advance in the U.S. for The Satanic Verses, Rushdie’s American agent warned him that the modest advance Bloomsbury could offer would queer the deal. So after much soul-searching, he changed publishers and agencies in Britain, in the process damaging two deep friendships.
After the fatwa, however, both Liz Calder and Deborah Rogers put aside their hurt and resolutely rallied to Rushdie’s side—staunch friends in a time when it seemed that everyone else in the world was keen to stab him in the back.
And things worked out beautifully for Liz Calder, as it happened:
Liz came to feel that she had dodged a bullet. If she had published The Satanic Verses, the ensuing crisis, with its bomb threats, death threats, security expenses, building evacuations and fear would very probably have sunk her new publishing venture right away, and Bloomsbury would never have survived to discover an obscure, unpublished children’s author called Jo Rowling.
When I’ve fallen out of blogging for a long time, it gets harder and harder to get back into, even as I feel a vague sense of guilt for ignoring this blog for so long. But then something happens, or I read something, which either delights or infuriates me enough that I feel compelled to return to the blogosphere.
This post falls into the “infuriated” category.
I’ve posted a few times in the past about my antipathy to Margaret Wente’s column in the Globe and Mail, both for her frequent anti-academic screeds and for her generally inept and misleading argumentation. So when accusations of plagiarism on her part came to light recently, I followed the story with about as much schadenfreude as you might expect, though it was tempered by the tepid response in mainstream Canadian media … a response that was puzzling, considering that these were not vague smears but well-documented instances of Wente basically cutting and pasting from other sources. That she plagiarized was not in question, as much as Sylvia Stead, the Globe’s public editor, tried to soft-pedal it. The point of contention became whether this was really as big a deal as people were making it out to be.
Speaking as an academic? It’s a big deal. Say what you will about the U.S. print media—and I could say a lot—but they deal with journalistic malfeasance with extreme prejudice. Jayson Blair became a pariah; Jonah Lehrer has basically had his very existence scrubbed from Amazon’s lists; and now Fareed Zakaria is under the gun. To be fair, this isn’t to suggest that all sins against journalistic ethics get punished (I’m still waiting for Judith Miller’s virtual exile), but there’s at least an overriding sense that intellectual dishonesty is a very serious offense.
Slowly, slowly, mainstream organs have started to respond. Maclean’s published an editorial condemning Wente; the National Post then followed suit, and CBC Radio issued a release stating that she would no longer be a frequent panelist on Jian Ghomeshi’s show Q. But then this morning I came across an editorial in the National Post by Terence Corcoran, who defended Wente and attacked her detractors as “self-righteous, self-important, self-aggrandizing competitors” and “dreary dictatorial avatars of pretentious rules and political correctness.”
Political correctness? Really? Pretentious rules? Once upon a time, conservative thinkers were reliable guardians of such “pretentious” rules as intellectual honesty and scholarly rigor, and they lambasted liberal and left-leaning thinkers for such “postmodernist” offenses as pastiche and moral relativism; certainly Wente has sought to establish her editorial bona fides by way of constant pseudo-contrarian attacks on liberal groupthink and academic political correctness. Which I have to imagine is why Mr. Corcoran then attacks Wente’s attackers as foot soldiers of the PC thought police. For the record: her professionalism is not being questioned for critiquing multiculturalism, defending Rob Ford’s incompetence, attacking universities’ academic standards, mocking public transit, vilifying organic food, disdaining the Occupy movement, characterizing Newfoundlanders as lazy welfare addicts, accusing professors of being overpaid and underworked, or any of the dozens of columns she has written for the express purpose of poking us Liberal Elite with a sharp stick.
No: her professionalism is being questioned here for the very specific reason that using someone else’s words as your own is plagiarism. Full stop. As any undergraduate student knows, there are very specific rules determining this, and rather severe penalties for those who transgress, starting with a zero on the assignment, progressing through a zero in the course, and potentially culminating in expulsion. Considering how frequently Ms. Wente has written about the slipping standards at Canadian universities, one might imagine the rigour with which we set such rules would be on her mind. (As a friend of mine suggested, perhaps Wente would consider availing herself of one of the academics she has vilified to educate her on the definitions of plagiarism.)
I honestly had to read Mr. Corcoran’s editorial twice to be certain of its seriousness, and double-check the URL to make sure I hadn’t been sneakily redirected to The Onion. Basically, the gist of the argument is as follows: Ms. Wente has been victimized by haters and rivals, and the whole issue of “plagiarism” is a trumped-up charge by the PC crowd, something really hardly worth mentioning because all she did was leave out some scare quotes. We all make mistakes, not that this was a mistake, but if it was it was entirely understandable and hardly reprehensible. Oh, and the post of “public editor” is totally an infringement on the rights of journalists to write whatever the hell they want, and probably a contravention of free speech.
Seriously. Here’s his argument:
Newspapers and journalism in general, once bastions of press freedom, are now under the thumb of throngs of second-rate moralizing “experts” and outsiders who like their press freedom tightly controlled and monitored. There’s nothing wrong with criticizing writers, but there is a problem when outsiders can use artificial structures to suppress and control those writers.
I’m not entirely certain how you progress from a question of intellectual dishonesty to Big Brother. Considering the fact that the allegations against Wente are three years old and only now gaining national attention, and would not likely have come to light at all were it not for the “anonymous blogger” Professor Carol Wainio and her blog Media Culpa, it’s a bit of a stretch to suggest that Ms. Wente is the victim of totalitarian thought police. And if you think my characterization there is an egregious overstatement of Mr. Corcoran’s words, here’s his concluding sentence:
Ms. Wente, I suspect, now knows something of what it felt like during the Cultural Revolution in China, when ideological enforcers roamed the country to impose their views and expose running-dogs, remove people from their jobs and purge them from the system.
Yes. Poor Margaret, dragged from her home for her counter-revolutionary sentiments and sent to a brutal reeducation camp, rather than being privately censured and kept on at her job. (Then again, she was barred from Q, which may be punishment or reward).
But to return to the editorial: Mr. Corcoran’s sneering characterization of “outsiders” who have the temerity to (gasp!) have an informed opinion is shamefully anti-democratic. Given that Mr. Corcoran is a business writer, one would assume that the issue of intellectual property is something of a going concern for him. Or perhaps he is something of a radical on this front? Did he advocate, once upon a time, for such companies as Napster and their right to flout copyright? Does he scoff at anti-piracy commercials preceding movies? If so, this column would have been an ideal place to assert his quasi-Marxist ideals—it would have made his defense of plagiarism more coherent.
Then again, he does seem to believe plagiarism is a relatively new idea, as he asserts that Ms. Wente’s “major alleged crime against journalism was to fail to put quotation marks around somebody else’s words, something that is now defined in the blogosphere as plagiarism.”
Let that sink in a moment. Reread that sentence. And read it again. And then go back to his editorial to assure yourself that I got it right. “Now defined in the blogosphere as plagiarism.” Now. As in recently. In the blogosphere. Among the best responses to this editorial I have heard was someone characterizing it as trolling in a major newspaper, because really, the claim that the accusation of plagiarism is somehow a new thing invented by liberal bloggers to smear a respected columnist? … well, I just don’t know where to start.
So I won’t. I will leave it there, and hope that anyone else with half a brain recognizes the sheer absurdity and idiocy of that statement.
In the end, this “scandal” (which at this point really needs to be in scare quotes, because any real censure for Ms. Wente’s theft remains undelivered) is shocking but utterly unsurprising. Unsurprising because it is the logical end-point of Ms. Wente’s particular argumentative strategy. Among the defenses I have heard for her plagiarism is the plaintive “But she writes three columns a week!” Yes, at about one thousand words or so per column, she has quite the prolific output. It is perhaps serendipitous that I’m currently working my way through Christopher Hitchens’ Arguably, his last collection of essays published before his death. Oddly, it makes me even less sympathetic to Ms. Wente’s plight … not that every journalist and pundit should be held to Hitchens’ standard, but somehow reading Hitch’s often infuriating but always brilliant arguments throws Wente’s practice of vilification by way of cherry-picking into stark relief. Her strategy is lazy enough, intellectually speaking—every other column of hers I read is a paraphrase of some book she happens to be reading—that it should come as no surprise that she slipped up and forgot to throw up some quotation marks around a passage or three.
But does that make it forgivable? No. No, in thunder. The defenses of Ms. Wente’s transgression range from the absurd (see Corcoran, Terence, above) to the disingenuous … and while it was the former that prompted me to post this, on reflection it is the latter that is most dangerous. Ms. Wente has written endlessly, and stridently, about the loosening of standards, the slipping of intellectual rigor, the “everyone wins” ethos adopted by the education system, and above all the need for those who fail to suffer consequences.
Time for her to put her principles where her mouth is.
**********
UPDATE: Carol Wainio responds to Mr. Corcoran's editorial here.
On the heels of my last post ... FIVE years ago today, I became an uncle twice over, because apparently my brother and sister-in-law are concerned with buying the kids' birthday presents in bulk.
Yes, five years ago today, my nephew Zachary was born. For the duration of his infancy, he evinced a rather suspicious attitude to the world, as shown in this picture of him taken at my cousin Jen's wedding:
"These people know the Macarena is so ten years ago, right?"
And he has since then grown into a very serious and focused child, one who will wave off all help on a jigsaw puzzle because he's just going to get it, dammit! And he does. But he shares his sister's mischievous tendencies, and has a grin that would charm Voldemort.
To wit, a story my brother once told me that kind of sums up Zach: Matt, who runs triathlons, had gone with the little guy to the Running Room to get ... oh, I don't know, whatever psychotic people who run triathlons buy at high-end sporting goods stores. There was a women's running clinic on, so Matt discreetly made his way around them, but Zach walked right up in front of the audience. For a few seconds he stared at them very solemnly. And as they all melted at his cuteness, he started to dance.
Be afraid, ladies. Be very afraid.
Anyway ... Happy birthday, Zachary. Your Uncle Chris is ... crushing your head!*
Seriously. Imagine this at the age of twenty-five.
Not sold on the whole fishing thing yet.
"Why, yes ... I do in fact have radioactive blood. Why do you ask?"
*For those unfamiliar with The Kids in the Hall: this is not a weird psychotic non sequitur.
Seven years ago today, on the eve of leaving Ontario for my new life in St. John's, I essentially started this blog with a post announcing the joyous fact that I had become an uncle.
Seven years, and now that little pink loaf in a maternity ward has grown into a mischievous, feisty, fearless little girl whom I do not see nearly enough of and miss terribly. Morgan Emily Jean Lockett is a blond firecracker of energy who runs her parents and grandparents ragged with her endless inventiveness and the rather particular rules she has for playing (seriously: things are done just so when you play a game with Morgan).
So happy birthday, Miss Morgan, and know your uncle loves you and misses you.
I was about halfway through a review of the first episode of Aaron Sorkin’s new series The Newsroom, which premiered on HBO at the beginning of this week, when I realized I was basically just agreeing with the vast number of reviews out there—essentially, that the show (so far) is typical, tightly-written Sorkin fare, but one that goes unfortunately overboard on the sanctimony, relies almost entirely on straw man arguments, and lets the worst of his sexist tendencies come to the fore. For the record, I really, really want to like the show—I love the rapid-fire rhythms of Sorkin’s dialogue and his unapologetic privileging of intelligence and book-learnin’, and The West Wing remains one of my favourite and most frequently rewatched shows—but I do have to agree with most of the criticisms I’ve read (here's a handy round-up) … at this point it feels like he’s repeating himself, and not in a good way. And the now-notorious interview with Sarah Nicole Prickett in last Sunday’s Globe and Mail has focused a lot of anger and snark on Sorkin’s borderline* misogynist arrogance and spawned a Tumblr meme in ironic imitation of Feminist Ryan Gosling.
Also, I’ve been cringing a lot lately over Sorkin’s comments in the media. Prickett’s interview was just the worst of the lot … it feels now as if he’s become a legend in his own mind and has forgotten that the other half of “telling truth to stupid” (one of the worst lines from the pilot) is being receptive and open to new ideas and perspectives yourself, and being always willing to change your mind. The beautifully scripted arguments of The West Wing seem more or less absent in The Newsroom, replaced with people just lecturing each other.
All that being said, I’m still going to watch this season of The Newsroom … I’m keeping my fingers crossed that what I saw is just the early kinks of the series, which will get ironed out as it goes and settles into a better rhythm. There were enough lovely moments—unfortunately overshadowed by the sanctimonious speechifying—to give me hope.
Plus, I just love watching Sam Waterston. He was always the best part of Law & Order. Eyebrow acting like his is a rare thing.
So I won’t review the show beyond that … but it did make me revisit a theme that came up several times in my grad class on HBO this past winter, so I’ll talk about that instead.
But one more observation, which may be neither here nor there. This is Sorkin’s fourth series after Sports Night, The West Wing, and Studio 60 on the Sunset Strip; what the ill-fated Studio 60 and The Newsroom have in common is that they begin with a moment of crisis and rebellion, in which a broken system is revitalized by an injection of intelligence, doughty contrarianism, and altruism. Significantly, the similar moment on The West Wing didn’t come until episode nineteen in season one, “Let Bartlet be Bartlet” (unfortunately, this video cannot be embedded).
By contrast, both Studio 60 and The Newsroom start with their Let Bartlet be Bartlet moments, which has the effect of setting up rather steep expectations. And while there’s a similar moment in the pilot of Sports Night, it’s rather low key by comparison, and in both of Sorkin’s first two series, he lets us ease into these incredibly busy and bustling worlds without raising the stakes on day one. I’m not saying that Studio 60 failed for that reason specifically, or that The Newsroom is similarly doomed … but it does feel that, having made his televisual reputation creating inspirational and uplifting series, he cannot think small any more, and feels the need to launch his shows as one would a crusade.
I’ve been mulling over my reaction to The Newsroom all week, and I keep coming back to a pervasive aspect of HBO’s flagship dramas that sort of makes Aaron Sorkin an odd man out there. Let’s be clear: when it was first announced that Sorkin was creating a show for HBO, I rejoiced, as this was something I had long imagined as a marriage made in heaven. I’m now not so sure.
One of the most interesting things about HBO’s programming, specifically in regard to its dramatic series (half-hour comedies like Sex and the City, Entourage, or Curb Your Enthusiasm are a different thing altogether), is that they are overtly non-aspirational—which is to say, they (1) depict people, situations, and work/careers distinct from educational systems and apparatuses and the kind of accreditation that all entails; (2) are, further, overtly critical of the tacit idea that those institutions are an unproblematic path to societal happiness, and (3) reject utopian resolution.
The connection between education and utopia may seem an odd one to make, but I should be clear that “education” here is a cipher, essentially synonymous with upper-middle-class success in accredited professions. When one does a quick rundown of popular television from its inception, working-class protagonists and narrative frames are the exception, even if they account for some of the more popular shows (e.g. The Honeymooners, All in the Family, Sanford and Son);** far more common are aspirational workplaces (law firms, television stations, hospitals) or families with professional parents (The Cosby Show, e.g.).
The implausibility of that, however, was entirely beside the point. Part of the pleasure of the show lay in the very depiction of that lifestyle as an unstated utopian promise.***
By contrast, even a halfhearted viewer of HBO and HBO-like television knows that it reverses this tendency (again, in its dramas—Sex and the City is nothing if not aspirational). Such dramas as The Wire or Deadwood actively resist and frustrate easy narrative closure and the cyclic rhythms of episodic stories. This much has been observed quite frequently and by more astute observers than me. What I find striking, however, is the way in which this has translated into a shift away from a focus on desirable upper-middle-class careers and professions to predominantly working-class or criminal contexts, in which native intelligence, intuition, and cunning are privileged over accredited training and education.
In other words: no doctors, no lawyers, no captains of industry or business professionals, no scientists, and above all, no happy families. Which is not to say that these people and professions don’t appear on HBO’s dramas—they’re just not the focus of the various shows. So, doctors Gloria Nathan and Jennifer Melfi can play key roles on Oz or The Sopranos respectively, but in the end the shows are not about them; there are lawyers and judges aplenty on The Wire, as well as politicians and journalists and teachers, but The Wire is no legal drama any more than it is a show about politics or journalism.
"OK, when you say you 'had him whacked,' I'm just going to assume that 'he' is a
weed in your yard."
(It could be argued that The Wire complicates my argument, insofar as it does tend to be more of a workplace drama than the other series I’ve cited, and does feature what in other contexts would be aspirational professions. But these professions are overtly depicted as essentially tribal, less a matter of formal training and education than a set of innate qualities that make you, for example, “good po-lice” or not. In David Simon’s Baltimore, the principal conflict is between those of the first category, from police to politicians to teachers to reporters, and those of the second, who tend to be the ones running things).
Clockwise from top left: Deadwood, Oz, The Wire, Sons of Anarchy, Breaking Bad, The Sopranos.
This observation doesn’t hold across the board—there are many exceptions—but the trend is marked enough to give pause, especially when one considers the sheer critical mass of legal and medical dramas, and their pervasive popularity.***** By contrast, such series as The Wire, Deadwood, Oz, The Sopranos, Treme, Carnivale, and, yes, Game of Thrones, as well as such non-HBO offerings as Breaking Bad, Sons of Anarchy and Justified**** take as their focus cultures and contexts in which education and accreditation are all but irrelevant. Intelligence, by contrast, is highly valued—especially intelligence that allows one to navigate a shifting landscape of power and allegiances. In many of the shows cited above, the main players are pervasively working class, largely uneducated beyond high school (if that), but possess a native cunning and acuity. An excellent example of what I mean occurs in season three of The Wire (key sequence from 0:30-2:00):
But perhaps the most obvious example of what I mean is Tony Soprano: his rise through the ranks of the mafia has won him all the outward trappings of financial and societal success and allows him to live in an exclusive suburban community. The show is frequently at pains, however, to point out just how out of place he is there. When we encounter his neighbours, they are all accredited professionals in law, business, or medicine—precisely the kind of people we might expect to see in mainstream network aspirational narratives. They are ill at ease with Tony, both because they know his reputation and because he remains a working-class Jersey kid.
"Psychiatry and cunnilingus brought us to this."*
*Note to people who haven't watched The Sopranos: this is a hilarious reference.
There is a fun little irony at work with shows like The Sopranos and The Wire, insofar as they have come to be the television of the liberal intelligentsia, while in substance and content having no use for the educated and professional classes. In my favourite example of how HBO has come to be embraced by intellectuals, an ad in The New Yorker for season five of The Wire featured playwright Tony Kushner gushing about how much he loves the show, saying that “there’s so much to admire, it’s hard to be concise,” essentially giving permission to those who might otherwise be embarrassed to watch television.
By contrast, Aaron Sorkin’s shows (and films) are overtly aspirational—and more than that, they function as liberal intellectual fantasias, utopian depictions of how things would be if we genuinely had the best and the brightest running things. The Newsroom—the first episode, at least, though I’d be very surprised if the rest of the season proves different—is like a distillation of the traits and tendencies of his previous work. Intelligence is the brass ring of the Sorkin world (the Sorkinverse?), but intelligence of a very specific nature: one must be articulate and argumentative, with legions of facts and figures at one’s fingertips, and it must also be intelligence in the service of civic or political high-mindedness.
None of this, I hasten to add, is a bad thing in and of itself—certainly, as I’ve said, I love much of Sorkin’s work, and really, I have no issue with liberal utopias on television, certainly not if they’re really well written and acted. Which is one of the reasons why The Newsroom’s first episode was such a disappointment. As mentioned above—and as pointed out in pretty much every review I’ve read so far—The Newsroom doesn’t lack for argument or virtuoso displays of statistical literacy … what it does lack is the even-handedness that made The West Wing a joy to watch. Which is not to say that The West Wing was fair and balanced (to coin an expression), but it was rarely mean-spirited, and allowed for the principals to be wrong and have their minds changed. For example:
By contrast, here’s the opening scene from The Newsroom, in which anchor Will McAvoy has his extremely articulate meltdown (the rant starts at 1:36):
To be fair, there’s a lot in this rant to admire, and it certainly articulates the frustration of people on both sides of the political spectrum with no patience for mealy-mouthed platitudes and/or patriotism, and who loathe dogmatic talking points of any political stripe (certainly, Will’s comments on the idiotically reductive use of “freedom” as a catch-all are thoughts I’ve frequently had). But after his rant, he descends into the kind of nostalgic “America was once so great” rhetoric that so desperately needs a whole whack of caveats … “We waged a war on poverty, not poor people (provided they were white)” would be, for example, a key amendment there. McAvoy’s nostalgia—and the show’s—for the days of Edward R. Murrow needs to be tempered with a dollop of awareness of the period’s systemic racism and misogyny.
OK, I’m starting to rehash the show’s early reviews, so I’ll get back to my principal argument. The bottom line, here, is that there is little to differentiate this show (so far) from something Sorkin might have created for network television. About the only difference we note, stylistically, from his previous three series, is the occasional dropping of the f-bomb. Perhaps I’m selective in the series I watch, but I’ve come to expect an awful lot more from HBO. Sorkin at this point seems painfully taken with his own sense of himself as a brilliant writer.
But he ain’t got nothin’ on David Simon. Just sayin’.
*On rereading this post before putting it up, I now think “borderline misogyny” is generous, and puts me in mind of a West Wing line:
BARTLET: “You know that line you’re not supposed to cross with the president?”
CJ: “I’m coming up to it?”
BARTLET: “Look behind you.”
Big sigh.
**The very big exception here is cop shows. See the footnote below for more on that.
***All of this is not, I hope it goes without saying, a hard and fast rule. Exceptions abound, and I’m not even venturing out of the realm of fictional television.
****Again, police procedurals are a slightly different case, as they privilege innate rather than learned intelligence, and field experience rather than education. But they are arguably the most utopian mainstream genre—aspirational not in the sense of offering an improved lifestyle, but in promising to maintain the societal equilibrium necessary for it.
*****A few notable exceptions: the Glenn Close legal drama Damages; Mad Men; HBO’s polygamist drama Big Love; but then, each of these “exceptions” is itself difficult to unproblematically designate as an aspirational show.
This is the second installment in my series on contemporary fantasy.
When I teach my introductory course on literary theory and criticism, I always find occasion to talk about varieties of realism—the degrees of mimesis and verisimilitude in narrative, how we distinguish between a “realistic” text and genres such as fantasy, SF, or magical realism, and for that matter the differences between social and psychological realism, naturalism, and so on. One of the key points I like to start with is that it tends to be intuitive—we know a story is realistic because it feels realistic, and often our evaluation of a story’s quality proceeds from the basic sense we have of how “realistic” it feels—even when the story’s premise is overtly unrealistic, as with Dracula or The Shining (to choose two examples more or less at random).
This “feeling” of realism comes down to the unspoken contract between a story and its audience—what we otherwise call the willing suspension of disbelief. When it comes to stories, we’re pretty willing to suspend a lot of disbelief, at least as far as the story’s basic premises go—accepting a whole host of impossibilities from ghosts and vampires, to magic and sorcery, to alien worlds and species. At the same time, however, there are certain things we are less inclined to accept; as Aristotle pointed out, probable impossibilities are preferable to improbable possibilities* … and as Oscar Wilde concurred, “Man can believe the impossible, but man can never believe the improbable.” To put it another way: we cheerfully accept the idea of alternative realities, but reject out of hand a character acting contrary to what we’ve come to expect of them—so Aragorn attacking a legion of orcs is consistent with his character, whereas stealing money from the hobbits to support his secret drug addiction, not so much.
"The real question, Greedo? Do you feel lucky? Well, do ya? Punk?"
Or to cite a notorious example that exercises SF dorks like myself like nothing else: when George Lucas re-released Star Wars, the tweaked scene between Han Solo and the bounty hunter Greedo incited nothing less than fury in diehard fans. Why? Because in the original, Han shoots Greedo under the table with the casual insouciance of an amoral, roguish gunslinger. In the re-release, Greedo shoots first, and Han only survives to return fire because the bounty hunter is, apparently, an appallingly bad shot. All of which had fans doing everything short of taking to the barricades with torches and pitchforks.**
So think about this: spaceships, faster-than-light travel, lasers, aliens, and cosmic sorcerers are not the problem here. These are perfectly acceptable impossibilities.*** What is not acceptable are the violations of logic the revised scene perpetrates: the improbability that Greedo would get the drop on Han Solo, the improbability that a hard-bitten bounty hunter could miss at that range, and, worst of all, the violation of our understanding of Han’s character. We’ve only known him about ten minutes at this point, but all of the signs tell us that he is precisely the kind of guy who’d unhesitatingly shoot you under the table to save himself. The film might be SF, but the character is totally Eastwood.
The cognitive dissonance evinced by haters of the revised Han/Greedo scene (among whom I number myself) is an example of a violation of what I’m calling diegetic logic. “Diegesis” refers to the world or sphere of reality established by a given story, and everything that entails—in particular here, the logic and rules governing that world, that which makes it comprehensible. Fidelity to this logic also goes by the more familiar term believability … we believe Han Solo is a spaceship pilot; we do not believe he’d let Greedo get the drop on him. The former is part of the diegetic frame, the latter a flagrant inconsistency in character.****
Of course, the contract between story and audience is not exactly ironclad, and the rigor and consistency of diegetic logic depends on the genre in question. And if we were to apply the logic of realism to fairy-tales, for example, we again arrive in the realm of (often amusing) cognitive dissonance.
When I talk about this in class, the example I always like to give is the song “The Bonny Swans,” by Loreena McKennitt. It’s a ballad, adapted from a recurrent medieval folk tale about a girl murdered by her older sister, who returns as a harp to expose the sister’s crime.
Yep. A harp. Stay with me here. For those who would rather listen to the song itself:
The story has a host of variations, but the basics are as follows: the eldest daughter in a particular family is jealous of her younger, fairer, more beloved sibling. She is also in love with the young man to whom her sister is betrothed. So, one day when out walking beside the river, she pushes her sister in and drowns her (as one does when one is the jealous older sister in a folk tale).
"Seriously, the water's only two feet deep. Stand up, you idiot."
The drowned girl fetches up on shore far downstream, and her body is discovered by a harpist … who proceeds to turn her into a harp (again, as one does). The McKennitt song is not unusual in its macabre description of the process, outlining how the harpist uses her breast bone for the main bow of the harp, her hair for the strings, and her finger bones for the frets. The harpist then takes his new “harp” to the wedding ceremony of the murdered girl’s sister—as she is, of course, marrying the (presumably gormless) man to whom the harp/girl was betrothed. The harpist places the harp/girl in the middle of the hall, and it starts to sing. And in a moment that comes as a shock to no one, the harp’s song reveals the treacherous crime of the would-be bride.
That’s where McKennitt’s song ends. And as should be obvious from the way I related that story, the premise is absurd (I tried to tell it straight, and just couldn’t manage it). But then, that’s the fairy-tale standard. And McKennitt’s orchestration is sublime … but whenever I hear this song, I always imagine an epilogue in which the father of the bride dispenses justice to his murderous daughter, but then beckons over the harpist. The conversation goes something like this:
“So let me get this straight. You found my daughter’s body, and you turned her into a harp.”
“Yes, my lord. And as you saw, she sang of her sister’s—”
“I heard what she sang of. That’s not the problem.”
“I’m sorry … there’s a problem?”
“Yes. You see … you found my daughter’s body.”
“Yes, my lord.”
“You found my daughter’s body. And you turned her into a harp.”
“Yes, but she sang and revealed that her sister—”
“That doesn’t change the fact that you turned her into a fucking harp!”
[The harpist stares at the father, uncomprehending. The father continues.]
“A fucking harp! What kind of sick fuck comes across the corpse of a beautiful girl and thinks to himself ‘Oh, hey, a drowned girl! You know, I bet she’d make a bitching harp!’”
[At this point, the father loses the capacity for speech and has his guards drag the harpist off to the dungeons, where he presumably awaits his guest-starring role on an episode of Law & Order SVU.]
This scenario amuses me endlessly, but it’s equally amusing to apply the same logic to, say, “Hansel and Gretel,” in which the titular children pause when they come upon a cottage made of gingerbread and candy in the depths of the forest, and think “Waaaaiiiit a moment.” Or Little Red Riding Hood is a little less credulous about Granny’s explanations for her whiskers and fangs. Or someone alerts the authorities, in exchange for a nice bounty, about the little man with the capability of spinning straw into gold (something I assume any government, modern or medieval, would have great interest in).
"In answer to your question, hate transformed me into a vindictive gremlin when I
realized I was the only Scottish actor not cast in Brave."
But of course that all contravenes the basic contract we make with fairy tales, in which we accept such leaps of physical and behavioural logic without question. Step away from your family cottage into the dark forest, and you have no idea what you’re going to encounter. And whatever you do encounter, however bizarre, you must just accept—that is the understanding we have with the genre. (It’s tempting to suggest that the logic of fairy tales is, in this respect, the logic of children, still open at a young age to all possibilities and bizarre eventualities, but anyone who has attempted to read fairy-tales to a precociously logical child knows that isn’t necessarily the case).
Fairy-tales follow a diegetic logic similar to that of romance—not the romance of Harlequin and Fabio, but traditional medieval romances and quest sagas, in which the departure for uncharted territory entails a departure from, variously, civilization, law, order, and most importantly, rationality. What happens in the white spaces of the map is anyone’s guess. Hence the various Arthurian romances, especially the quest narratives, tend to be populated with bizarre people and creatures, magic, and individuals with inscrutable reasons for what they do … such as, I don’t know, a black-clad knight guarding a river crossing.
"Well technically, one way or another all wounds are flesh wounds."
But Chris, you say … such figures and circumstances, whether it’s a black knight or a gingerbread house, are invariably symbolic. Reading them with any degree of literality completely defeats the purpose, and indeed, the people who wrote these stories and sagas didn’t mean for us to read them as realistic.
Well spotted, hypothetical interlocutor, well spotted. And that is all very true: the modern bias toward verisimilitude often makes it difficult to properly historicize the original intents of such stories. Applying the rules of realism to a fairy-tale is about as misguided as thinking (for example) that the Bible is meant to be taken as literal, historical fact. And really, that’s just silly talk …
There are those who tell me I shouldn't mock Christian fundamentalism.
Actually, wait ... no ... nobody has said that.
OK, I’m ending this post here—I go on from here to talk about the resurgence of fairy-tales in film and television, but given that this is already quite a long post, I will make that the next installment. Until then …
"My name is Tyrion Lannister, and I approve this message. Now, where's that
wine you promised?"
*For once, YouTube has failed me—otherwise I would gleefully be putting up the clip from The West Wing of Sam Seaborn expounding on this very quotation.
**Yes, I know I’m mixing metaphors here. It seems appropriate.
***Perhaps not literally impossible, but so close as to make no difference. Remind me at a later date to do a lengthy post about the difference between speculative and extrapolative SF.
****Responding to angry fans, George Lucas defended the change as simple clarification—he had always meant for Greedo to shoot first, he claimed, and that the original version made that unclear. To which all I can say is that this is further evidence that Lucas is a hack and had little understanding of the signifiers and tropes he was employing. Unfortunately, he seemed determined to prove this beyond the shadow of a doubt with the three prequels.
I kind of love it, but here’s what’s wrong with it:
It tacitly endorses surveillance culture.
It nakedly panders to our sense of ourselves as, variously, quirky, likable, generous, and virtuous.
All of this behavior is explicitly tied to drinking Coke.
I’m reasonably sure the Coca-Cola company cares little about how quirky, likable, generous, or virtuous I am so long as I buy their product. If it became clear tomorrow that pedophiles and serial killers were the most reliable and numerous consumers of soft drinks, Coke would find a way of pandering to them.
That being said, you have to appreciate the ways in which the ad’s makers push our buttons. Advertising, after all, is the art of making us feel at the expense of making us think. You come out of watching the ad slightly more optimistic about the human race, if not genuinely happy. People are amazing! Look, we have footage! And they drink Coke!
Ads like this always make me very conflicted. On one hand, I love watching all that found footage. Whenever I’m starting to feel really cynical about life, I’ll usually be turned around by some random act of kindness I witness or experience. People are amazing—and here’s the video footage to prove that.
But I’m also always aware of why and how these images are being deployed when I see them in advertising. Not that it makes a big difference for me: I really only ever drink pop these days as a hangover cure, and then I’m hardly brand loyal (“Coke, please.” “Pepsi OK?” “Whatever. Just give it to me NOW.”) So why do I care? Possibly because I resent having my emotions manipulated in the name of branding … but then, with actual TV ads, at least there’s an understanding of what they’re after. I find ads infinitely less annoying than when characters on a TV show I’m watching start expounding on the virtues of the Ford Focus and its onboard GPS.
Plus, there’s the fact that some ads are just good at what they do … even when their manipulation of your emotions is so cynical that it’s quite breathtaking. Exhibit A on this front has to be the following Bell ad:
This ad doesn’t make me tear up so much as it makes me sob. Seriously. It doesn’t make me want to switch my phone service back to Bell, mind you, so I suppose that it’s a failure there. But it does absolutely punch a handful of my personal buttons, given that I am (a) a WWII history buff; (b) so very proud of Canada’s military history; and (c) generally anxious about what happens when we lose that living link to such an important part of our past.
I know everything wrong with the ad, but I love it nevertheless.
I’m curious to hear other people’s thoughts. What advertising functions for you as ambivalent pleasures?
I have occasionally threatened to use this blog as a testing-ground for my academic writing ... and sometimes I have actually done so, often inadvertently, but I’ve never tried a sustained series of posts dedicated to a specific topic.
Well, here we go—I’ve been working at my normal glacial pace on a handful of articles on the novels of Terry Pratchett specifically, and fantasy as a genre more generally. For a long time now, they’ve been less a handful of articles in process than snowdrifts of notes dealing with far too many facets of the larger topic, and the process of trying to connect them into cohesive arguments has been not unlike having root canal surgery performed on my frontal lobe. By a bear.
But for all that, some things are coming into focus—the big thing being that I really need to work harder on bringing things into focus. So over the next couple of weeks, I’m going to post some forays into this research, and I’ll try to do it in a more conversational way than I would if writing for an academic journal. Arguments and challenges to my premises are not just welcomed but encouraged …
My name is Christopher Lockett, and I read fantasy. That comes as no great galloping shock to anyone who has read this blog, but I figured I should start with the basics. I read fantasy fiction, and, more than that, fantasy really is to blame for where I am in life now … by which I mean, a professor of English. I read The Lord of the Rings when I was twelve, and it was the first thing I ever read that affected me on a gut level. It took me a few months to get through it—from the end of grade six, over the summer (I remember vividly reading the Battle of Helm’s Deep in the back of the car on the way up to my uncle’s cottage), and into the start of grade seven. I had the full edition of LotR, all three books in one volume, plus appendices (the appendices are important—I will be returning to Tolkien’s appendices in future posts). Some time in the autumn of 1984, I came to the last sentences of the last chapter:
At last they rode over the downs and took the East Road, and then Merry and Pippin rode on to Buckland and already they were singing again as they went. But Sam turned to Bywater, and so came back up the Hill, as day was ending once more. And he went on, and there was yellow light, and fire within; and the evening meal was ready, and he was expected. And Rose drew him in, and set him in his chair, and put little Elanor upon his lap. He drew a deep breath. “Well, I'm back,” he said.
And then, refusing to allow that this novel that had consumed me for several months was finished, I read through the appendices. Tolkien helpfully included a chronology that starts with the first age, and ends with the death of Aragorn some decades after LotR technically ends:
In this year ... came at last the Passing of King Elessar [that's Aragorn, for the non-dorks in the audience]. It is said that the beds of Meriadoc and Peregrin were set beside the bed of the great king. Then Legolas built a ship in Ithilien, and sailed down Anduin and so over Sea; and with him, it is said, went Gimli the Dwarf. And when that ship passed an end was come in Middle-earth of the Fellowship of the Ring.
THAT was the moment that gutted me—Sam coming home after seeing Frodo et al off to the West was bad enough, but THAT was the moment at which something in my mind said “that’s it—these characters whom you have loved so much? You get nothing more.”
It was devastating. But it was a turning point in my literary and intellectual development, because it was the first time I realized that literature, novels, stories, could have affect. They could, quite literally, change your life. And LotR was the first thing I read that made me want to write. I bought a spiral notebook and a pack of Papermate pens at the local drugstore (starting my lifelong love affair with stationery) and started writing a story that was a thin knockoff of Tolkien. I also—and this was key—invented my own maps. Maps were important. I loved the maps of Middle-Earth in LotR, and one of my great loves in my readings in fantasy has been the maps of imaginary places at the front of the novels—be it Middle-Earth, Pern, Westeros, or Earthsea. (I have always had a cartographic imagination).
But anyway … fantasy in the form of Tolkien (and to a lesser extent, C.S. Lewis) was my defining literary experience. Since then, it has always been a genre to which I have returned. Leaving high school and entering university, it became what I would call a guilty pleasure; sometime around the middle of my PhD, I stopped being guilty about it. (I actually now loathe that term. I’m sympathetic to the desire to appreciate “art,” but at this stage in my life I have known high school dropouts working as carpenters who have a better grasp of Thomas Pynchon than I do, and accomplished, critically acclaimed poets who take inspiration from America’s Next Top Model. It takes all kinds, and one of the things literary study has taught me is that even the crappiest, most formulaic novel can teach us something, even inadvertently, and that talented writers take their inspiration from a host of sources).
So, part of this series of posts I’m working on is a return to my roots, as it were … and to ask: what is the appeal of this genre? And what has it done for us lately? The answers to both of those questions are at once no-brainers and endlessly complex … at its most formulaic and simplistic, fantasy is a nostalgic return to a pre-modern sensibility, which unfortunately tends to include somewhat problematic depictions of race and gender. At its most inventive, however, fantasy represents a remarkable fusion of the historical and romantic imagination.
What first prompted my research into this topic, however, was the gradual realization of how much contemporary fantasy has shifted from Tolkien-esque mythopoeia to eminently humanist narratives. It seems slightly counter-intuitive at first blush: why employ a traditionally anti-rationalist, anti-realist genre to this end? But when one reads George R.R. Martin’s A Song of Ice and Fire, Neil Gaiman’s American Gods,* Terry Pratchett’s Discworld novels, Philip Pullman’s His Dark Materials, Richard K. Morgan’s ongoing A Land Fit for Heroes series (which currently comprises The Steel Remains and The Cold Commands), or Patrick Rothfuss’ The Kingkiller Chronicle (to name a few), one finds a definite shift away from a preoccupation with magic and the supernatural, and the mythical, to one with, well, people … and to frame it more philosophically, with a Foucauldian conception of power not as a transcendent ordering principle, but the product of human exchange and interaction.
To be clear: this is not to suggest that contemporary fantasy has abandoned the mythopoeic tropes on display in Narnia or The Lord of the Rings, or that those novels didn’t have a host of human dramas on display. In the first case, we wouldn’t have a genre we could reliably call “fantasy” in the way we have come to understand it post-Tolkien without some combination of magic and sorcery, fantastic creatures, different “kinds” of people (e.g. elves, dwarves, gnomes, etc.), all within the context of an identifiably medieval, pre-modern world.
I realize there’s a lot I’m saying here that is currently ill-defined, and much that devoted readers of fantasy will likely take issue with (as well as devoted readers of critical theory and philosophy—certainly “humanism” is a rather fraught and catholic concept, almost as much as the concept of “fantasy,” and I don’t do myself favors when citing Michel Foucault and humanism in the same breath). To which I just beg patience. As mentioned above, this is all essentially an exercise in forcing myself to clarify my thoughts on the matter(s) at hand and, hopefully, provoking discussion and argument.
Next up: a consideration of diegetic logic in fantasy and fairy-tales. Sexy!
*Yes, including American Gods in this list is a little one-of-these-things-is-not-like-the-others, but I'll be returning to talk about Neil Gaiman quite a lot over these posts ... not because he's a typical practitioner of fantasy fiction, but because his thematic preoccupations help explicate the rest of what I'll be talking about.
It occurred to me that it has been a very, very long time since I've had anything to say or show about this blog's original raison d'être, i.e. life in Newfoundland.
I will attempt to correct that as the summer progresses and we (fingers crossed) have better weather for landscape shots.
Today has been absolutely beautiful, so we went for a hike along the northern part of the East Coast Trail. The following pictures are from the Silver Mine Head Path, going north from the beautiful Middle Cove:
Middle Cove Beach, seen from the trail.
Newfoundland is a hiker's paradise, doubly so because you don't have to travel far at all. Middle Cove is just a fifteen to twenty minute drive from our house. I'll post more pictures when the capelin run.
In the interests of keeping this blog more active, I’m hoping to post more regularly on topics of quasi-academic interest. I have a few posts on contemporary fantasy I’m working up as a sort of Game of Thrones postmortem, as well as some thoughts on the ever-expanding zombie genre.
But I thought I’d start with a film review. I saw Prometheus last night, which counts as my second-most-anticipated summer film (big shock: The Dark Knight Rises takes first place). And wow, it was a train wreck.
And yes: spoilers to follow. Lots and lots of spoilers.
I should know by now not to get my hopes up too high when a new Ridley Scott film comes out. I imagine it must be frustrating for him to have done his best work early on—Alien and Blade Runner, those two mainstays of film classes (I’ve lost count of the number of times I’ve taught them) were his second and third feature films, respectively. Since then by my count he has had one unqualified success (Thelma and Louise), two flawed but powerful films (Black Hawk Down, Gladiator), a number of much more deeply flawed but still watchable films (Black Rain, G.I. Jane, American Gangster), and a host of unmitigated disasters (White Squall, Hannibal, Kingdom of Heaven, Matchstick Men, Robin Hood).* He also directed Legend, which falls into that odd category of being something that probably isn’t nearly as good as we remember it, but we all saw it at the age of eleven, so for some it gets a pass.**
On the whole, not a great track record. But here’s the thing: the great films are so great that they come very close to balancing the sheet, and make those of us who are devotees of Alien and Blade Runner*** live in hope that his next film will match their aesthetic and narrative brilliance (and, let’s be honest, the sheer coolness).
We keep hoping. And keep getting disappointed. But this time, there was renewed hope: Scott was returning to the scene of his first major triumph, and creating a prequel to Alien! At first, there was skepticism … but then, as teasers and trailers started appearing, there was cautious but ever-growing hope. It looked AMAZING, for one thing. And for another, the more we learned about the premise, the better it looked. Never mind that we could pretty much figure out the story from the trailer: an exploration team follows ancient clues promising to unlock our origins as a species to a distant planet, where they encounter some sort of contagion or infection that will threaten our existence, and in a climactic scene the crashed spaceship they discover in Alien, well, crashes.
Whoever did the marketing for Prometheus should quite possibly have written the script—it probably would have been smarter. All of the trailers and ancillary bits of publicity (such as a TED talk done by Guy Pearce’s megalomaniac CEO Peter Weyland, circa 2023) were greater, in the end, than the sum of Prometheus’ parts. There’s a paper to be written somewhere about how skillful viral marketing frequently seems to eclipse the artistry of the film being advertised … but for now, I’ll limit myself to reviewing Prometheus, which I saw last night in IMAX 3D.
OK, so the positive things first: one, it looked every bit as good as the trailers promised. See, this is the ambivalent thing about Ridley Scott’s crappy films: however crappy the story, they always look amazing. So you always see hints of Alien or Blade Runner in his signature style, which makes brilliant use of light and shadow and the contrast between grandiose, totalizing shots and close, claustrophobic terror or anxiety. And you know that, if only the story wasn’t shite, it could be a brilliant film.
Certainly, this was the case with Prometheus. As I said, I saw it on the IMAX screen, and if you’re going to see this film, go see it in the theatre.
The second positive element was the cast. It was both a mitigating factor and a total frustration that almost every one of the actors was (a) a consummate professional, and (b) someone I just loved watching: Lisbeth Salander Noomi Rapace, Michael Fassbender, Tom Hardy [correction--the actor I took for Tom Hardy is actually Logan Marshall-Green. Properly contrite am I], Stringer Bell Idris Elba, and Guy Pearce (in a whole lot of prosthetic makeup that qualifies as the film’s worst special effect). The one out-of-place person was Charlize Theron, who just seemed outclassed by the rest of the cast. She played an ice queen executive in charge of the mission, and was so wooden and toneless that I kept waiting for her to turn out to be an android.
On the other hand, Michael Fassbender played an android, and played it brilliantly. One of the best parts of the film is him alone on board the spacecraft while the rest of the crew slumbers in cryosleep. The subtlety of nascent threat he brings to his emotionless robot is easily on par with Ian Holm’s not-dissimilar character in the original Alien.
But as lovely as Fassbender’s performance is, it points to one of the narrative problems: I’ve seen this story before. As a prequel to Alien, it anticipates that story; but coming thirty-three years after the original was released, it feels simply derivative. I’m sure there’s some brilliant psychoanalytic reading to be done (“Recursion and Return: The Iterative Narratives of the Alien Franchise”), but really it just sort of feels done. A few elements:
A cold and self-interested corporation desires to exploit a discovery on a distant planet, and is willing to sacrifice its crew.
An alien species is discovered, all dead through some mysterious plague.
A cold and inscrutable android deliberately puts the crew at risk.
A not-so-dead alien species contaminates the crew.
The cold and inscrutable android gets decapitated.
The ship is destroyed to prevent the malevolent life form from getting to Earth.
Sound familiar? It’s understandable to have a certain amount of repetition when producing new films in an established franchise, but really, this just felt lazy.
The premise of the film is this: two archaeologists (Noomi Rapace and Logan Marshall-Green), having discovered the same star system diagram in many ancient texts, convince the head of the massive Weyland Corporation to bankroll an exploratory expedition to the star system. The predictable disasters, as partly outlined above, ensue. But the larger philosophical/theological question deals with our origins on Earth. The film opens in a primordial landscape atop a massive waterfall. A huge, musclebound albino humanoid watches an equally huge flying saucer (Ridley Scott likes to work large-scale) depart into the clouds. He (?) then eats a mouthful of grotty-looking stuff, and promptly begins to disintegrate in great pain. The perspective shrinks to the cellular, and we see his DNA breaking apart. He falls into the water, and again, at the cellular level, we see DNA recombine into new strands and then turn into life.
Very obviously, this is the pageantry of new life forming on Earth. Skip ahead millions of years, and our archaeologists make their find. Skip ahead a few more years and the exploratory ship Prometheus arrives at the planet.
"All in the game, baby ... all in the game."
The premise is intriguing and compelling, but not executed well at all. The questions of god and belief, of evolution vs. design, and of parents and children are all handled in terribly hamfisted ways. Eventually it is reduced to a simplistic paternal drama. We had assumed Peter Weyland was dead; as it turns out, he’s on the ship, his presence known only to the android David and Charlize Theron’s robotic-but-not-a-robot expedition leader Vickers. It was made clear earlier that, as Weyland’s creation, David considered Weyland his father, and Weyland considered David his son. In what is perhaps the most unsurprising revelation of the film, Vickers is Weyland’s actual daughter. Weyland has smuggled himself along in the hopes that, in finding the origins of life or meeting its engineers, he might be granted longer life … which, in a particularly hackneyed moment, Vickers complains is against the natural order of succession.
The twist, such as it is, is that our “engineers” were using this planet as a weapons lab—and that they were planning to return to Earth and wipe us out (for reasons that are never made clear). The substance the crew discovers in the alien bunker and the underground ship is evidently some sort of agent that mutates DNA in malevolent ways—as Stringer Bell (sorry) observes, it is a biological weapon. And once it gets loose in certain crewmembers’ systems, it wreaks its havoc and starts producing monstrosities that only become identifiable as the H.R. Giger alien of the earlier films in the final sequence.
It is when the bio-weapon starts making its presence known that the film’s wheels start to come off. Charlize Theron’s wooden acting notwithstanding, the film has a lot of promise early on. Michael Fassbender’s solo sequence in the spaceship in particular gave me hope … but in hindsight, that was mainly because he didn’t have any lines from the script to speak, and instead was just Michael Fassbender being compelling. The more the script reveals itself, the worse it gets … and Ridley Scott’s reliable standby, his visuals, loses its power once the excesses of the alien/bio-weapon become, well, excessive …
Don’t get me wrong—I like a good gory alien monster flick as much as the next guy (provided the next guy is a total SF nerd), and the Alien franchise has always delivered on that front. Except … well, think of the first Alien. The gore and the horrifying figure of the alien are shocking, but strategically so. The story of how Ridley Scott didn’t warn his actors that he would be spraying them with real blood in the “birth” scene—and so the shock, horror, and disgust on their faces is not feigned—is one of those great bits of film history lore. In Aliens, the gore takes a backseat to the action, but there is enough of it—and the aliens are sufficiently terrifying and repulsive—that we still cringe at key moments.
It’s always difficult to locate that fine line between well-managed abjection and excess, and I suppose it’s always going to be somewhat subjective. But I found Prometheus crossed that line once we got to the scene of Noomi Rapace’s self-administered caesarean (never mind that it isn’t exactly believable that she would sprint away from the surgical pod after having her belly stapled), and then the final confrontation between the engineer and the proto-Giger alien. Perhaps it was the engineer being orally raped—and in case that wasn’t obvious enough, the alien then shudders orgasmically and collapses, presumably dead … or perhaps just post-coital; at first it isn’t clear. But … blecch.
"Hm. Yes, I do feel like someone is watching me. Why do you ask?"
Anyway, I’ll leave off with some nerdy nitpicking. First: the basic premise of the film is that the alien species that created us left clues so we could find them. However, the planet to which they send us isn’t their homeworld—it is, as Stringer Bell (sorry again) realizes, a weapons plant kept remote from their home because you don’t manufacture incredibly dangerous biological weapons in your own backyard. And the film ends with Noomi Rapace leaving with the segmented David in tow on another of the alien ships to find their actual homeworld.
So … why would aliens leave behind directions to their weapons stockpile? Wouldn’t that be like welcoming immigrants to the U.S. at Area 51 rather than Ellis Island?
And secondly—and this is the thing I just can’t get past—SO MUCH of this film was obviously designed to cater to diehard Alien fans. Everything about how the film ends sets us up for how Alien begins, except for one utterly baffling discontinuity. In Alien, when the crew of the Nostromo**** enters the alien ship, the first thing they find is the dead alien space jockey strapped into some sort of flight chair. It is long dead, and its chest has a hole as if something exploded out of it. After John Hurt’s character has the alien burst out of his chest, we realize in hindsight that the alien space jockey died the same way.
In Prometheus, when David wakes the last remaining “engineer,” the big alien proceeds to kill everyone present except Noomi Rapace, who escapes, and David, who is decapitated but can still talk. The engineer then straps himself into the flight seat to complete his mission: presumably, to wipe out Earth’s population. The alien ship takes off, and Stringer Bell (no apologies) rams the Prometheus into it and causes it to crash … in precisely the position where the crew of the Nostromo find it twenty-nine years later.
Now, as I was watching the film—and I’m curious to know if anyone else was thinking the same thing—I kept thinking “OK, so space jockey guy has an alien inside him. When did that happen?” My best guess was that David, for inscrutable reasons, somehow infected him with the bio-weapon … but that seemed far-fetched at best. But then the engineer leaves his ship to go after Noomi Rapace, and instead ends up tangling with the big-ass mutant alien. At this point I’m thinking, “OK … so he gets impregnated and goes back to his ship to, I don’t know, try to take off again, and THAT’s when the alien comes bursting out.” But no … apparently, he dies when the mutant alien essentially orally rapes him. And then so does the mutant alien. And in the film’s final sequence, the Giger alien (these alien distinctions are getting cumbersome, sorry) bursts out of his gut. FAR AWAY FROM HIS SHIP.
Argh. I can forgive hamfisted storytelling, but not baffling stupidity. It’s as if Scott and his screenwriters (one of whom was Lost’s Damon Lindelof—any comments, Nikki?) didn’t actually bother to re-watch Alien, but just went on vague memory.
At any rate. Bottom line: Prometheus was, to say the least, a massive disappointment … but after so many disappointments from Ridley Scott, I really shouldn’t have expected anything different. That being said, it was a hellishly impressive work of visual art, true at least in that respect to Alien, with many breathtaking moments and subtle grace notes. A shame the story couldn’t live up to that.
--------------------------------
*Then there are the films I haven’t seen, which may include a gem I’ve overlooked. But somehow I doubt it.
**Though perhaps less so now since Tom Cruise detonated his career and retroactively made us see the batshit in all his earlier roles.
***I leave out Thelma and Louise here, because however great a film it was, it does not have the same cachet as the other two. Speaking for myself, I love it, but it inspires none of the fanboyism that Alien does.
****Anyone else wonder why they didn’t give Prometheus a similarly Joseph Conrad-themed name? The Narcissus? The Patna? Typhoon?