A humanities scholar's occasional ramblings on literature, science, popular culture, and the academy.

Sunday, May 21, 2017

Ghost in the Wood: Some thoughts on Twin Peaks

People often talk about the work of David Lynch in terms of genre. Much of his oeuvre fits the tradition of neo-noir, featuring postmodern remixes of tropes developed by the likes of Otto Preminger, Edward Dmytryk, and of course, Alfred Hitchcock. Noir elements are certainly present in spades in Twin Peaks, as are elements of soap opera. Twin Peaks isn't subtle in its self-aware embrace of soap opera styles, motifs, and plot twists; it signals this embrace through its frequent references to the show-within-the-show, Invitation to Love. Less discussed in analyses of the show (at least analyses that I've read--I don't profess expertise in Lynch scholarship) are its debts to the haunted house genre.

Twin Peaks is the least ambiguously supernatural work that Lynch has produced. Whereas in films like Eraserhead and Mulholland Drive many of the weird fiction elements are attributable to characters' dreams or hallucinations, in Twin Peaks it's difficult if not impossible to interpret the demons, whatever they are, as anything other than a literal presence. Those demons' home, the Black Lodge, is the first in a tradition of sites of spiritual significance on TV, a tradition that runs through Buffy the Vampire Slayer's Hellmouth and Lost's island.

Significantly, the Black Lodge is located in the woods. The woods, and indeed wood itself, constitute the show's dominant image. The opening credits sequence provides us not with images of the town itself, but with images of the surrounding woods, juxtaposed with images of the sawmill that transforms those woods into industry. The first face that you see in the pilot episode is not that of the dead Laura Palmer, but that of Josie Packard, owner of said sawmill. The last time you see Josie's face, it's in the wood--after she dies of fear, having killed her lover Thomas Eckhardt, her spirit is absorbed by the wood of the Great Northern Hotel, wood that was likely cut in her own mill. Wood is imbued with an animism throughout the series, most notably embodied by the Log Lady's log. How appropriate that the forest is called Ghostwood.

But if wood can carry spirits, the birds that perch on that wood, like the bird in the opening credits' first shot, can carry demons. The owls are not what they seem.

When I point to Twin Peaks' indebtedness to the haunted house genre, I mean haunted house stories in the broadest sense, encompassing stories of homes and people occupied by evil spirits, be those the spirits of the dead or the spirits of demonic others (in Twin Peaks we encounter both). Modern haunted house films, from The Amityville Horror to Paranormal Activity, take the elements of classic gothic fiction and move them out of the castle and into suburbia, transforming the middle class ideal of home ownership into a nightmare. This is why the genre is so amenable to the Lynchian treatment--many of his films, most notably Blue Velvet, are about finding the nightmares hiding behind the veneer of suburban contentment.

One of the best haunted house films ever made is Tobe Hooper's Poltergeist (1982), in which a real estate agent discovers, after a series of horrific events, that his new home is built atop a cemetery, and that his boss, when building the neighborhood, moved only the headstones, leaving the bodies. Poltergeist begins with an iconic shot of a cathode ray tube television at the end of the broadcast day, turning to static--the static through which the ghosts haunting the property communicate with the people in the home. Ten years later, the prequel film Twin Peaks: Fire Walk with Me (1992) would begin with an image of static on a TV tuned to a dead channel, an image that recurs throughout the film.

Looking at the opening of the film alongside the opening of the series, we get a juxtaposition that constitutes the heart of the story: nature vs. modernization, the untouched vs. the developed. People misremember Poltergeist as being a film about a house built on an "old Indian burial ground," but in fact the house is only built on a modern cemetery. In Twin Peaks, however, the area's connection to indigenous populations is unavoidable, represented either through Deputy Hawk's expositional monologues about local beliefs, or through Ben Horne's appropriations of native iconography in the design of the Great Northern (it's telling that Ben's intellectually disabled son always wears a Native American headdress). The Ghostwood National Forest may not be a burial ground, but it's certainly a site of the sacred.

That's why I think it's so interesting that the show's biggest B plot line involved Ben Horne and Catherine Martell fighting over the rights to Ghostwood Estates. I used to think this plot line was pure soap opera, but I've come to see it as essential. Twin Peaks' existing suburbs already boast haunted houses, most notably the Palmer house--what would happen if Ghostwood did become another housing development? What poltergeists would be unleashed?

Perhaps in the past 25 years, Ghostwood Estates actually has been built. If there's one omission from the revival's cast list that has me worried, it's Piper Laurie as Catherine Martell. Without her, I fear that the fate of Ghostwood Estates will be an ignored or underdeveloped component of the show. I hope it won't be. I hope we'll get to see Ghostwood, and Teresa Banks's ring, and Andy and Lucy's now 25-year-old child, who I hope is a daughter named "Judy."

But trying to predict what Lynch will do is probably foolhardy. More than anything else, I just hope to enjoy the ride.

Thursday, March 16, 2017

Let’s Talk About Miscarriage, Baby (Guest post by Paula Matzke)

A note from Brian: This is a post from my wife Paula. I don't have much to add, but we both believe in the value of normalizing these challenges. And, as usual, I'm blown away by her eloquence.

A note from Paula: In the course of trying to make uncomfortable topics less uncomfortable, things inevitably get weird and awkward. If you're not in the mood for weird and awkward (especially if you know Brian and me personally) this may not be the post for you. Take what you need, and leave the rest. 

Content warning: miscarriage (obviously)

Miscarriage. I don’t even like typing that word. Even when you type it, it still sounds like something that should be whispered. In like, an old women’s sewing circle. Followed by a bunch of those sad tongue clucking sounds and a few murmured “Oh, what a shame”s.

Miscarriage. I had one (‘Oh, what a shame!’). I initially did not want to talk about it because miscarriage is weirdly…embarrassing? However, I very quickly realized what a huge number of other women have realized: that our collective not talking about it makes it feel worse, and that we probably should talk about it more.

There are some very powerful things written by other women about their miscarriages. Many of them are written by women who have some degree of distance from it and have accrued some wisdom or, in many cases, a subsequent child. I’ve noticed, in particular, that we seem to feel more comfortable hearing about women’s miscarriages when they now have a baby. Ali Wong has a great comedy special in which she discusses her miscarriage. In interviews she says that people didn’t start really laughing at it until she was visibly pregnant with another child. We want there to be a happy ending.

My miscarriage happened last week. There is no baby. There is no wisdom. There is no happy ending.

I 100% do not want to discount accounts of miscarriage from women who went on to have a baby and/or learned something really profound. I have read and heard many of these narratives and found them extremely helpful and comforting and I think we need that. I am so grateful to these women who have written about miscarriage with beauty and wisdom and eloquence, but you know what? Maybe sometimes we don’t need wisdom, beauty and eloquence. Maybe sometimes we need some discomfort and vulgarity and rambling.

I am no stranger to writing about difficult and uncomfortable topics. In the past I have written very candidly about psychosis and psychiatric hospitalization because I also have bipolar disorder. My medical file now emits a palpable cloud of awkward pity when opened. However, there is a certain pride in having ‘stigma’ be a keyword in pretty much all the articles about any part of your medical history. I’ve got double stigma now: I have fucking STIGMATA (I’d make the obvious Jesus joke here, but when you have bipolar disorder you’re not allowed to do that, so you know, make it yourself). Not to make light of other medical conditions, but honest to god, I really hope next time something awful happens to me it’s appendicitis, or pernicious anemia, or some other shit that is socially acceptable to talk honestly about outside of a social-worker-led support group or a blog post that people will call you “brave” for writing.

So what happened? Well, I was having a perfectly normal pregnancy. This may be surprising to some of you that know me and/or Brian, because it was supposed to be a secret. You are not supposed to tell people before you make it to 12 weeks. This is because most miscarriages happen before 12 weeks and it is very sad and awkward to have told people you are pregnant and then have a miscarriage and have to tell them you are not pregnant. It is also very sad and awkward to NOT have told people you are pregnant and then have a miscarriage and have to tell people not only that you are not currently pregnant, but also that you were pregnant. It is sad and awkward either way and if you are pregnant you should tell people whenever the fuck you want to.

Anyway, I was pregnant and it was perfectly normal and uneventful. I didn’t bleed; I was throwing up; my boobs hurt; these were all good signs. I had no particular reason to worry about miscarriage, but I did anyway because I am an anxious fuck. From the minute I got that positive test, hell, even before I got that test, I worried about miscarriage. I knew that about 1 in 4 pregnancies ended in miscarriage, so it wasn’t a totally irrational worry, at least in the beginning. But as the weeks went on and the risk of miscarriage theoretically dropped (I say theoretically because this is an event that has already happened and therefore the probability of its having happened is now 100%), I still worried. This was not because I had some kind of special intuition; it was because I am an anxious fuck and I always assume the worst. My husband, bless his heart, did not worry. This was, at the time, the more rational position. He was full of hope and blissful ignorance and was therefore very reassuring.

The morning of my first pre-natal appointment, my husband took his blissfully ignorant (and very cute) ass to work as normal. I woke up in kind of a funk. I had previously been very excited for this appointment, but I was noticeably less excited that morning. Again, I would like to think it was some kind of “mother’s intuition”, but more likely it was that it was the first day after spring break and I was a tired anxious fuck. I went into the appointment eager to finally have some concrete evidence that the little sprout inside of me was there and ok. It was there. It was not ok.

It’s kind of a cliché to say that you can’t possibly understand what it feels like to, at one second, think that you will probably be bringing a baby home in seven months, and, the next second, know that you won’t because your stupid embryo doesn’t have a fucking heartbeat. So I will try.

You know that feeling where you are expecting reassurance? Like when you text your friend to see if they got home safe, or tell your partner that you love them just so you can hear it back, or hear a car pull into your driveway when you’re expecting someone home and they’re late? It’s like when you get a call back from your friend’s number, but it’s someone else’s voice. It’s like hearing that infinite pause instead of “I love you too.” It’s like when you look out the window and it’s a police car.

It’s that feeling when your wife went off to her pre-natal appointment that morning and you hadn’t even thought about it until you looked at your phone after class and saw the missed calls.

My husband is no longer full of hope and blissful ignorance. It has been squashed right out of him like the last dregs of toothpaste and now for any future pregnancies we can be anxious fucks together.

My body failed at pregnancy, but it also failed at miscarriage. I had what is called a “missed miscarriage”, which is when the embryo or fetus dies, but your body doesn’t recognize it, so you don’t miscarry “naturally” (which is the polite way of saying you don’t bleed and cramp and expel the dead embryo or fetus from your body). In this situation you can either 1) wait for the above “natural” miscarriage to happen, 2) induce bleeding and cramping and associated dead embryo or fetus expulsion with medication, or 3) have surgery to clear out the dead embryo or fetus from your uterus.

Many women like to let things happen without additional intervention or prefer to have the comfort of miscarrying in their own home. That’s perfectly valid. I, however, did not want to let things happen naturally because nature is a goddamned asshole. Additionally, the prospect of copious amounts of blood and pain did not appeal to me, while a minor surgery that would get it all over with quickly and that involved a legal high did not sound so bad.

A week later I got what is technically called a “manual vacuum aspiration”. This is where they basically take a big syringe and suck out the dead embryo or fetus and associated uterine gunk. It is exactly the same procedure they use for many elective abortions. The difference is that when you wanted the pregnancy you get to do it in the hospital instead of having to go to a whole separate clinic and your insurance covers it. It doesn’t make a lot of sense.

It was painful and awkward, but I got to get high so I didn’t really care that much and I don’t remember much either. I threw up on the car ride home. My husband pulled over and I spewed apple juice onto the dirty snow and thought about how this was the last time I would throw up because of this pregnancy.

Physically, I actually feel much better now, which I feel weird about. I’m no longer nauseous and exhausted. I haven’t bled or cramped that much since the surgery. Emotionally, I feel like a woman who just had a miscarriage. I have spent most of my time on the couch watching re-runs and eating take-out (but I’ve still managed to lose 3 pounds, so there’s that). I have not been back to work yet. This is not only because I want to cry and eat pizza in peace; it is also because my job this week was supposed to involve 1) making plans for my future that, just a few weeks ago, I expected to have to rearrange to accommodate a baby, and 2) teaching students a unit about kids dying of cancer. I do not want to do either of these things right now even though I probably physically could. I have also previously been on the student side of the TA-having-an-obvious-emotional-breakdown-in-the-middle-of-class situation, and it is not comfortable for anyone. My students deserve better.

Luckily (well not really luckily because, as I have been wisely told, there is no good news when there’s a dead baby involved) miscarriage is one of several “trump cards” in life. Similar to how saying you have diarrhea will get you out of pretty much any social obligation, informing people of your miscarriage is SO awkward and uncomfortable for everyone involved that no one is going to ask follow-up questions or challenge you. They just want to get out of that conversation as soon as possible and they don’t want you around crying and being sad. To be frank, this is one of the main reasons that I told people. I needed that instant pity and I wasn’t above getting it.

When you tell people about your miscarriage they now know, not just that you’ve had a miscarriage, but also some shit about your sex life, which is fun. When you are young and recently married and obviously very sad about your miscarriage people will assume that you have been “trying” (which is polite code for ‘having unprotected sex’) and may continue to do so in the future. “Trying” is also not polite conversation, so now there’s just a whole bunch of unspoken knowledge and assumptions about your and your partner’s reproductive organs out there. For those who are curious, yes Brian and I had been “trying”. Yes we will continue to be “trying” in the future. No, we don’t want to talk about it with you unless you’ve also had a miscarriage and want to be sad about it with us.

That’s it. I hope I’ve at least satisfied your morbid curiosity about miscarriage. And for those of you who didn’t have any morbid curiosity because you already know all too well yourself: I am so, so sorry. I hope my words have at least provided some small comfort to you in the same way other women’s stories have comforted me. Like I said before, I don’t have a happy ending. What I do have is pizza, Sex and the City streaming, and a bottle of wine. So let’s talk.

Saturday, March 11, 2017

On Impatience with "Problematic" Stories

Because I'm perpetually behind in my pop culture consumption, I only recently finished Louis CK's series Horace and Pete (2016) and Denis Villeneuve's film Sicario (2015), both on Hulu. They're both well made, with solid acting, writing, and directing, and I probably would have enjoyed them both quite a lot if I had watched them when they first came out. But, watching them in 2017, I hated them both. In the wake of the election, my tastes changed--specifically, my patience for "problematic" narratives has diminished substantially.

I've often joked about the way in which the word "problematic" gets thrown around sometimes as a euphemism for "bad," but that's not quite accurate. When I teach stories (be they novels or films or TV shows or whatever), and when I talk about stories outside of the classroom too, I try to lead with the assertion that it's important to acknowledge ambiguity. No story has a single interpretation that is "right" to the exclusion of all other interpretations. That's especially important to acknowledge when evaluating the sociopolitical implications of a story. I doubt that there is any story that is purely progressive or purely reactionary. When someone says that a story is "problematic," often what they are communicating is that they acknowledge the story's ambiguity but feel that its implications lean more towards the reactionary than the progressive. Because people so often adopt a defensive crouch when confronted with the reactionary implications of stories that they enjoy, I've always maintained that it's perfectly okay to enjoy problematic stories--we all like some problematic stories--and it's perfectly okay to draw out more progressive interpretations that exist within a story's ambiguities, so long as you don't try to shut down conversation or deny interpretations that may be less comfortable.

I still believe this, but on a purely personal level, it feels like the balance has shifted, and as I said, I just don't have as much patience for some problematic narratives. Take Sicario, a film about U.S. federal agents trying to take down a Mexican drug cartel. The film not only upholds stereotypes about Mexican criminality, but explicitly uses those stereotypes to justify the horrendous authoritarian tactics of its protagonists, including torture, extrajudicial killings, and partnerships with murderers. It's certainly possible to read the film against the grain. Our viewpoint character, an FBI agent played by Emily Blunt, is horrified by the abuses committed by her CIA counterparts, played by Josh Brolin and Benicio del Toro, and in some respects the film can be seen as asking us to share in that horror and condemn these actions. But ultimately, it's hard not to see this film as aligning with Donald Trump's rhetoric, presenting a dystopian view of the Mexican border as a lawless land requiring brutal suppression. It's a stomach-turning worldview.

The problematic worldview of Horace and Pete is much more subtle, but no less troubling. Louis CK and Steve Buscemi play the titular Horace and Pete. Horace is a sad sack divorcee who is estranged from his children, and Pete is a schizophrenic recently released from a mental hospital (the show's offensive depiction of mental illness is a topic for a whole other blog post). The two run an unprofitable dive bar that has been in their family for 100 years, along with their sister Sylvia, played by Edie Falco, who is suffering from cancer. Much of the series focuses on Sylvia's desire to sell the bar coming into conflict with Horace and Pete's rigid, self-destructive adherence to the traditions established by their physically and emotionally abusive parents and grandparents, the bar's previous owners. Their story is interspersed with conversations from the bar's regulars, mostly about politics. The series was released during the 2016 presidential primaries, so many of these conversations are built around the creation of cynical equivalences between the Republican and Democratic candidates. In a way, the show's depiction of both its title characters and the barflies offers a useful take on the "white working class" whose disaffectedness led to Trump's victory. I can see it serving as a time capsule, providing a glimpse into a very particular moment in the history of white masculinity. The problem is, the show asks us to sympathize with these broken men and their nihilistic, misanthropic patrons as they all refuse to break free from their cycles of self-destruction, taking others down with them.

But I can't offer that sympathy. Not in 2017, when I know where those self-destructive tendencies lead.

Friday, December 30, 2016

What Scares Me

Years ago, when I was still a Ph.D. student, I was teaching a first-year writing course. I'd assigned students to write a "position paper," in which students were to take a position on a contemporary issue and argue for it. Most papers made familiar arguments on topics that 18-year-olds are invested in--violent video games don't make people more violent, schools should eliminate standardized tests, etc. But one student's paper--conveniently, the bottom paper of the stack, which I read late into a long evening of grading before returning the papers the following morning--was an argument against covering birth control under the Affordable Care Act. This was during the height of debate over the ACA, and unbeknownst to me before reading this paper, this particular student was a deeply conservative Catholic. His paper was an utter mess--though initially framed as an argument about the need for religious exemptions to birth control coverage, it quickly descended into an unfocused rant about the evils of contraception, highly confrontational and lacking any coherent organization. Trying to control my own offense at the paper's tone, I gave it a low C (which felt generous) and provided a brief, carefully worded end note about its tonal and organizational issues. A few days later, the student came to my office hours to discuss his grade, though it was unclear whether his intent was to get a better grade or to litigate his argument further. Repeatedly, I would attempt to turn the conversation to the rhetorical and analytical content of the essay, only to get pulled into an argument about contraception itself. I could feel that I was being cast in the role of the stereotypical liberal university teacher; the conversation was an exercise in trolling more than a discussion of academic writing. But the sad thing is, I genuinely sensed that the student didn't realize there was a difference; to this 18-year-old, argument was argument.

I eventually let the student rewrite the paper on a different topic and raised his grade slightly, an outcome motivated by exhaustion rather than a sense of fairness--he had successfully worn me down, and as a grad student it wasn't worth it for me to fight to make the C stand. This would not be the last time I felt like I'd failed as an educator to adequately confront a student's ideological entrenchment--the student who, when asked to bring an example reading to class, picked a 10-year-old climate change denying editorial from a right-wing magazine; the student who wrote a paper critical of Western medicine whose only sources were anti-vaxxer websites; the student who, in a discussion of violence at Trump rallies, insisted, "to be fair, there's been violence on both sides"--these examples all stick with me for the way they've left me feeling blindsided.

I don't believe teaching can or should be an ideologically neutral act. However, nothing in my training as a teacher of writing and literature has prepared me to handle overt ideological conflict. I've acquired some basic skills on my own, but for the most part, the material that I cover as a humanities teacher presupposes a certain amount of common ground, and I no longer believe we can or should assume that such common ground exists.

Here's what scares me: I know that a year from now, or two years from now, there will be college students citing Milo Yiannopoulos's new book as if it is an academic text, citing statements from the new president as if they are authoritative truths, and many of the people on the front lines in confronting those ideas--the teaching assistants and writing instructors--will be ill-equipped to handle it, just as I was and largely still am. We train instructors on how to teach evaluating sources, and we train instructors on how to teach rhetoric, but we don't train instructors on how to teach ideology. At least not directly--certainly, by training instructors to teach sources and rhetoric, we are indirectly training them to teach ideology, but I don't believe we can afford to be so indirect in the future. Teachers need the equipment to confront racism, misogyny, homophobia, and science denialism when and where it shows up in their classrooms--to confront it with the compassion, empathy, and respect befitting an educator, but to confront it nonetheless. I have decent improvisational skills, but I need more concrete strategies--we all do.

Of course, resources providing these strategies already exist, but they're more important than ever before. Here are a few questions I've been mulling over, pertaining to specific teaching scenarios that I've encountered in the past that I suspect will become more common in the future:

  • How do you grade and comment on a paper when the fundamental problem with its argument isn't the rhetoric or the sources or anything you covered in class, so much as it is the first premises from which the paper is arguing, e.g., an unexamined assumption about biological race?
  • How do you defuse an encounter with a confrontational student in office hours?
  • Is there such a thing as a student who is a "lost cause"? That is, how do we identify and deal with students along a spectrum of ideological entrenchment, from those who simply have never examined their problematic beliefs before, to those so steadfast in their bigotry that there is little to nothing productive we can do with them in the course of a semester?
  • What role do facts play in an ideological discussion, and how should an instructor handle a situation where they don't have the relevant facts at hand? If one student peddles false or misleading information in a classroom discussion, but the instructor doesn't have the resources to fact check in the moment, how can the instructor effectively debunk the misinformation while maintaining authority?
  • How do you manage sensitive discussions in an ideologically diverse classroom? That is, in a hypothetical classroom made up of 2 or 3 students from vulnerable populations (e.g. queer or POC students), 5 or 6 students who consider themselves allies, a dozen or so indifferent or politically disengaged students, 5 or 6 international students without the same background in American political discourse, and 1 or 2 right wing students who might be inclined to be more confrontational, how does an instructor manage conversation so as to provide an education to everyone at once (this is, in my anecdotal experience, a fairly common demographic breakdown of a ~25 person class at Michigan)?

Obviously, no two scenarios will ever be the same, so none of these questions have easy or universally applicable answers, which makes ongoing teacher training that much more important. I honestly don't even know if I'll still be teaching in a year or two, but I'll still be in academia in some capacity or another, and whatever I end up doing, I hope I'll have a network of colleagues with whom to explore these questions. We'll need each other as resources now more than ever.

We also need institutional backing, since those on the front lines are also those with the greatest professional vulnerability: graduate students and non-tenure track faculty. We shouldn't have to risk our jobs confronting ignorance and bigotry, but if we don't confront the ignorance and bigotry, I don't believe it's a job worth having in the first place.

Saturday, June 18, 2016

The Premature Death of a Course: A Glimpse into how Universities Value the People they Pay to Teach

Last Fall I saw a Facebook post from a former student of mine from Shanghai who is now studying at the University of Michigan. In the post, he commented on Ann Arbor's air quality and expressed sadness and concern for his parents living in Beijing, for whom he'd just bought a new air filter. Coincidentally, directly below that post, Facebook showed me this meme, which had been shared by a history professor acquaintance of mine:
The juxtaposition of my former student's post and the professor's post got me thinking about the language of abjection with which we talk about pollution in China, and Beijing specifically, and about how air pollution constitutes a kind of Faustian bargain with the Mephistophelean forces of industrialization and modernization. And it got me thinking about the history of that Faustian bargain--how it started in Europe, moved westward across the Atlantic, and has now moved westward across the Pacific.

These thoughts stuck in my brain for a few months, and by January I had come up with an idea for a course that I wanted to teach examining the history of urban air pollution. The course would consist of three units: one on nineteenth century London, one on twentieth century Los Angeles, and one on twenty-first century Beijing. The course would be interdisciplinary; within each unit we would examine scientific questions (chemically speaking, what was the air pollution in each of these cities, where did it come from, how was it measured, and how did it affect human health and the environment?), humanistic questions (how did artists and activists work to make the air pollution visible and represent its effects?), and sociopolitical questions (how did politicians and regulators address the problem of air pollution?). The course would begin with Oscar Wilde's observation in "The Decay of Lying" that painters were responsible for making London's fog visible:
Where, if not from the Impressionists, do we get those wonderful brown fogs that come creeping down our streets, blurring the gas-lamps and changing the houses into monstrous shadows? To whom, if not to them and their master, do we owe the lovely silver mists that brood over our river, and turn to faint forms of fading grace curved bridge and swaying barge? The extraordinary change that has taken place in the climate of London during the last ten years is entirely due to a particular school of Art.
And it would end with the artist Nut Brother's recent work, creating a brick out of Beijing's air pollution:
I talked to several professors about the course, and they recommended that I propose teaching it for U-M's Program in the Environment (PitE). I wrote up a proposal and in April I met with the program's director, who happened to be a former professor of mine. He loved the idea, and wanted me to teach it as a 300-level PitE course in Winter 2017.

There was just one complication.

As I've documented on this blog, I've been a contingent faculty member at the University of Michigan since I completed my Ph.D. in 2013. My rank is Lecturer 1. U-M has four levels of Lecturer, Lec 1 being the lowest. In my experience over the past three years, the most frustrating thing about being a Lec 1 has been the instability--I'm hired from semester to semester, and never know whether I will be teaching, and if so, what classes I will be assigned, until the last minute. Twice I've been laid off for lack of work: in the Winter 2014 semester I had to take a temp job at the library to pay the bills, and in Winter 2016 I was fired from the English Department and the writing center, only to get a last-minute partial appointment at the Honors College.

This past year, fed up with this uncertainty and seeing the hopes of a more stable faculty position rapidly dwindling, I decided to make a transition into academic libraries. I had already applied to and come close to being hired by several academic libraries; I decided to go back to school for a Master of Science in Information (MSI) to increase my chances of securing a good position in that field. This would have the added benefit of giving me something to do in Ann Arbor for the next two years while my wife finished her Ph.D. in Evolutionary Biology. We could then go on the job market together with several career options between the two of us.

I had initially hoped to pay for the MSI degree by applying to become a University Library Associate (ULA), but, as luck would have it, the ULA program was discontinued for the upcoming school year. My backup plan was to pay for the degree by teaching as a Graduate Student Instructor (GSI). And that's where the PitE course comes in.

PitE has offered me a position for Winter 2017, but they have offered it to me at the rank of Lec 1. Previously, it was the instability that frustrated me about working as a Lec 1; now, from the vantage point of someone going back to graduate school, the disparity in financial compensation is coming into sharp relief. It's important to note that at U-M, a Lec 1 and a GSI can do essentially the same work: they can both be hired as the instructor of record, teaching their own course, or they can be hired to lead discussion sections for a professor's large lecture class, like a "teaching assistant" (though we don't use the term TA here in any official way). However, according to the standard contracts for a GSI and a Lec 1, the work looks very different in terms of how we are compensated. Let's look at a side-by-side of what a GSI might get for teaching one course vs. what a Lec 1 might get for teaching the same course:

|                 | GSI | Lec 1 |
|-----------------|-----|-------|
| % appointment?  |     | 33% for one course |
| Tuition waiver? | Full tuition waiver, the value of which can vary depending on the school. For the School of Information it is $10,000 per semester. | No compensation equivalent to the tuition waiver. |
| Salary?         | Approximately $10,000 | Approximately $6,000 |
| Health care?    | Yes | Only for appointments over 50% |

Now, I should note that GSIs only teach one course per semester, and because Lecturers usually are hired at a 100% appointment (three courses per semester), they typically do receive health care and their take-home pay is substantially higher than graduate students'. I should also note that, compared to many universities, these rates of compensation are something U-M should be proud of, and are the result of hard work on the part of the graduate students' union, GEO, and the lecturers' union, LEO. But still, looking at this information side by side, it's striking to note the disparity in how two ranks of university employees are compensated for doing the same work, especially considering that GSIs, almost by definition, are inexperienced instructors without advanced degrees, while Lec 1s are experienced teachers with MFAs or PhDs. I think this disparity speaks rather clearly to the university's priorities, which lie more with cultivating future scholars than with compensating professional teachers. The sad irony is that those future scholars are more likely to end up in a job that looks like a Lec 1 position than one that looks like a tenure track professorship.

I haven't yet decided whether to accept PitE's offer to teach the course in Winter 2017 as a Lec 1 with a 33% appointment, or to see if I can find a GSI position with another department. PhD programs at U-M typically guarantee GSI positions to their students--I had ten semesters of support as a GSI during my PhD program--but the School of Information offers its MSI students no such guarantee, meaning I have to apply to positions one at a time. There are few GSI positions in the School of Information, and positions in other departments are difficult to obtain for applicants outside of the department. For the Fall semester, I applied for over 20 positions in different departments and only last week received an offer to teach discussion sections for a class in the Screen Arts and Cultures Department. It's a gamble, but right now I'm inclined to let the PitE course die a premature death.

The urban air pollution course is a passion project of mine, and such being the case I have been willing to put in the extra work of building the course from scratch. But passion won't pay my tuition, or my rent, or my grocery bills, or my doctor's bills.

Wednesday, April 13, 2016

A Few Quick Thoughts on the Colonial Adventure Narrative

There's a genre descriptor that I often find useful, the "Colonial Adventure Narrative." I use it, generally speaking, to refer to those stories where a colonized land serves as the site of adventure for one or more of the colonizers. It's a capacious genre encompassing a variety of story settings and formulas. The Jungle Book and Tarzan of the Apes are obvious examples; perhaps less obvious are Cowboys-and-Indians Western stories of the nineteenth and twentieth centuries. It's a genre most commonly associated with the Victorian period, but it's as old as colonialism and is still going strong. And while no one story element is necessary or sufficient for a story to fall under the umbrella category of the colonial adventure narrative, the genre is associated with a fair share of negative tropes, such as the noble savage and white man's burden. Generally speaking, the colonial adventure narrative is deeply entrenched in the perspective of the colonizers' culture, envisioning the colonized land as a site of unique opportunities and unique dangers. These dangers and opportunities are seen as coming from nature itself, and unlike the "civilized" culture of the colonizers, the colonized people are typically figured as a part of that natural landscape. It's easy to see how these narratives generate and reinforce a whole host of dehumanizing stereotypes, but it's also easy to see how that dehumanization is somewhat complicated. When writers like Rudyard Kipling depict indigenous people as having special skills or a closer relationship with nature, they often do so with earnest admiration, and on a practical level, these were traits to admire, even as their framing perpetuated untruths and reinforced the colonists' power over the colonized.

Now, I'm very far from being an expert on colonialism, and there's a whole host of historical and literary scholarship on the subject with which I have only a passing familiarity. What I am an expert on is science fiction, and science fiction (indeed, all pulp genres in one way or another) owes a lot to the colonial adventure narrative. To the genre's credit, its anti-colonial roots run as deep as its colonialism; both Mary Shelley and H.G. Wells, often called the genre's mother and father, were staunch critics of colonialism whose fiction frequently employed the conventions of the colonial adventure narrative in ways that challenged dominant ideologies. But many works in the genre, from Edgar Rice Burroughs's "Barsoom" series (which in many ways is Tarzan on Mars) to Dune (in which the son of a duke becomes a mighty whitey and leads an indigenous people in their fight against an empire) to even Star Trek (which Gene Roddenberry initially described as a "wagon train to the stars") evince classic patterns of the colonialist mentality.

Colonialism runs deep in pop culture's DNA. It's leapt from pulp magazines to radio dramas to comic books to TV to movies, and it's no wonder why--these stories are tremendously fun! Colonialism's legacy of racism can't be denied, but neither can the excitement that comes with these stories of exotic exploration and cultural contact.

In the past 30 years or so, storytellers have explored a few different options for how to resolve this tension. One option has been to follow in the tradition of H.G. Wells and try to subvert the genre. That, I would argue, is what George R.R. Martin attempts to do with the colonialist undertones of traditional high fantasy, exposing narratives about chivalry and white saviorhood as lies.
Whether that subversion succeeds is debatable.
Another option has been to simply ignore the old stories altogether. Despite the seemingly endless adaptations and remakes Hollywood provides us with, I think there's a reason we haven't seen an Allan Quatermain on the big screen in 30 years.
Yet another option is to get postmodern--to show a self-awareness about the genre traditions you are engaging with that creates some distance, even if you are not explicitly critiquing those traditions. That's what Steven Spielberg attempts to do with Indiana Jones and the Temple of Doom and what Alan Moore does with The League of Extraordinary Gentlemen.
A fourth option is to change characters' races--when adapting a story, make one or more of the white characters people of color, or when telling an original story that fits within the framework of the colonial adventure narrative, make the typically white hero a character of color.
Quentin Tarantino kind of tries to do all of the above.
Of course, a fifth option is "none of the above." Which brings me to the trailer for Doctor Strange:
This trailer looks undeniably badass. It also looks sadly familiar. Ever since Marvel released Iron Man in 2008, it's been at the vanguard of the trend embracing more sophisticated and spectacular superhero movies, and, consequently, it's been the leader of the pop-culture landscape more generally. But their success has been built on two sensibilities that, I think, are coming into conflict here: (1) a willingness to tell new stories unlike what superhero films and TV shows have done in the past (like Captain America: The Winter Soldier and A.K.A. Jessica Jones) and (2) a faithfulness to the pulp attributes of comic book storytelling that previous filmmakers have eschewed (Christopher Nolan) or treated as pure camp (Joel Schumacher). Guardians of the Galaxy's earnest embrace of comic-bookiness is what made it possibly the best Marvel movie, but I fear that that same embrace might be this movie's biggest limitation.

The comic books of the 1930s-1970s were inheriting the sensibilities of the pulp magazines of the 1890s-1950s, which themselves inherited the colonialist sensibilities of the 19th century. That means that a lot of the classic comic book narratives predate the various modes of interrogating the colonial adventure narrative that I described above. Doctor Strange's acquisition of mystical powers from the Orient, a story first told in comic books in the 1960s, is a classic example.

I've heard others criticize Doctor Strange for "whitewashing." It is true that Tilda Swinton's character, "The Ancient One," is typically portrayed as Asian. However, it is also true that Baron Mordo, a typically white character, is played by Chiwetel Ejiofor--as they had previously shown by casting black actors to play Nick Fury and Heimdall, Marvel is not afraid to cast black actors in substantial supporting roles that were originally written as white. It's also true that casting Tilda Swinton in a role typically portrayed as male represents an increase in female representation as well.

Whitewashing is part of the issue here. It's worth noting that we've seen numerous traditionally Asian characters played by white actors in recent years, including Liam Neeson as Ra's al Ghul in Batman Begins, Ben Kingsley as the fake Mandarin in Marvel's Iron Man 3, and Benedict Cumberbatch himself as Khan in Star Trek Into Darkness. Thinking about the representational politics of their casting choices would probably have been the easiest way Marvel could have interrogated the genre traditions it was entering into. I was a big fan of the idea of casting Chilean actor Pedro Pascal as Doctor Strange back when the movie was first announced. This film seemed like the perfect opportunity to give its protagonist a race lift. Instead, it will be the 13th film in the Marvel Cinematic Universe with a white protagonist.

But the problem isn't simply one of whitewashing--there are patterns and structures at play. Notice how two of the three movies I listed above, Batman Begins and Iron Man 3, feature a white villain (Liam Neeson's Ra's in Batman Begins, Guy Pearce's Aldrich Killian in Iron Man 3) hiding behind an Asian persona (Ken Watanabe as the fake Ra's in Batman Begins, Kingsley as the fake Mandarin in Iron Man 3). In both cases the simple storytelling structure--uncovering the villain behind the villain--serves to affirm white supremacy. And notice the similarity between the Doctor Strange trailer and Batman Begins, both figuring the Orient as the place where the white protagonist gains special knowledge. These colonialist tropes make some sense in a 1960s comic book, or an 1860s novel, but they're out of place in a 21st-century film.

Representation is important, but I would argue that there is a deeper issue here. Marvel's commitment to remaining faithful to its source material is causing it to abdicate any interrogation of the centuries-old mentalities embedded in the genres that it is drawing on.

Sunday, July 26, 2015

Go Set a Watchman, 2015

I don't read a lot of contemporary fiction, and I've only ever purchased two novels on the day they were released. The first was Harry Potter and the Deathly Hallows; the second was Go Set a Watchman. I can't say that there was a deep emotional motivation that led me to rush out and buy Harper Lee's book--I found To Kill a Mockingbird beautiful and moving when I read it in my early adolescence, but no more so than most people do, and I can't say that it's one of the books that particularly influenced me in my intellectual development. But I recognized the sequel's publication for the rare event that it was, and wanted to participate in it.

It's hard to engage with the book without wrestling with the extratextual question: why now? Coming so soon after the death of Alice Lee, Harper Lee's sister and literary executor, the discovery of the long-lost manuscript, and Lee's decision to publish it after 55 years of adamant refusal to publish another work, seemed odd. An investigation by the state of Alabama found that claims of elder abuse were unfounded, but it was difficult to shake the suspicion that a frail 89-year-old woman was somehow being manipulated or coerced. It was a strange situation; if the manuscript had been found after Lee's death, I would have been unambiguously in favor of its publication, just as I believe Max Brod made the right decision in publishing Franz Kafka's stories, and Dmitri Nabokov made the right decision in publishing The Original of Laura. I don't believe that artists are entitled to dictate their posthumous legacies that way. But of course, Harper Lee is still alive.

Ultimately, I decided to defer to the state of Alabama and others who have maintained that it was Lee's decision to publish the book. Even if her faculties are failing, I believe a person has the right to change their mind, and in this case I can imagine why Lee might have done so. In the wake of her sister's death, it makes some sense for Lee to contemplate her own legacy and perhaps to want to extend it. The publication might even serve as a distraction from her own mourning.

Whatever her reasons, I'm grateful that she published it. I read the first 180 pages of the novel the night it came out; work commitments then got in the way, and it took me another week to get to the last 100 pages. I've been ruminating on it since then. It's an uneven but fascinating novel. Sometimes it's clear why the publisher rejected it and encouraged her to write the story of Atticus defending Tom Robinson instead. The novel can be slow, directionless, overly sentimental, and clumsy. But it has moments of brilliance and beauty that match if not exceed my memories of To Kill a Mockingbird. I agree with Randall Kennedy's assessment: "Go Set a Watchman demands that its readers abandon the immature sentimentality ingrained by middle school lessons about the nobility of the white savior and the mesmerizing performance of Gregory Peck in the film adaptation of To Kill a Mockingbird."

As Kennedy notes, the first hundred pages or so seem overly nostalgic and mostly directionless. Overall, the book's greatest weakness is its lack of a plot, something Lee seems to tacitly admit when Atticus's brother, Dr. Jack Finch, tells Jean Louise, "The novel must tell a story." But that meandering quality helps Lee offer glimpses into the philosophical mindset of 1950s southerners, providing insights into the pathological mentalities that link veneration for tradition with the perpetuation of racial inequality.

What little story there is is simple enough: Jean Louise returns to Maycomb for a visit after living in New York, and finds it a more segregated and hateful place than she had remembered from her childhood. Whether the town actually is more racist than it was is ambiguous--the characters attribute the rise in racist fervor to the Supreme Court's decision in Brown v. Board of Education, but what the decision actually did to the mentality of white southerners remains an open question in the novel. Did the Supreme Court and the NAACP spark a reaction that made progress more difficult? Did the decision simply turn the volume up on racism that had existed more quietly in previous decades? Or was that racism always there, right on the surface, regardless of the Court, and had only been invisible to Scout thanks to her youth and white privilege? The answer may be clear to anyone with a sensible understanding of American history, but given how contemporary these debates remain, I find tremendous value in how Lee unpacks the psychology of white supremacy.

At times, the novel does feel tremendously contemporary, so much so that it's hard to believe the novel was written in the time it was set, rather than decades later. It often reads like a work of historical fiction using the past to comment on the present, the way that To Kill a Mockingbird does. In one scene, for example, Jean Louise walks in on a women's meeting hosted by her Aunt Alexandra. In a discussion of miscegenation, Jean Louise's response seems straight out of recent debates over gay marriage (and satires thereof): "When white people holler about mongrelizin', isn't that something of a reflection on ourselves as a race? The message I get from it is that if it were lawful, there'd be a wholesale rush to marry Negroes. If I were a scholar, which I ain't, I would say that kind of talk has a deep psychological significance that's not particularly flattering to the one who talks it."

The fact that these debates feel so familiar--in discussions of race and other social justice movements--might provide another answer to the question, why now? The fact that Atticus is revealed to harbor racist attitudes has garnered the most controversy since the book's publication. But maybe now is the best time to interrogate our white saviors, and, more importantly, interrogate our own veneration of them. Why, exactly, are we surprised that Atticus Finch would join a Citizens' Council to oppose the NAACP? What does that reveal about what we choose to see and not see about others? What does that reveal about what we choose to know, and what we choose to assume?

This might not be why Lee decided to publish the book in 2015, but I do believe it's why it's a book of 2015--a book for the era of Trayvon Martin, Eric Garner, Michael Brown, Tamir Rice, Freddie Gray, Sandra Bland, and so many others. It's a book for an America full of white progressives who elected the nation's first black president and actually patted themselves on the back for achieving a post-racial America while confederate flags still flew over statehouses, not even noticing that they were there until nine people were murdered in a church. We (and I'm including myself among those white progressives here) are all Jean Louise. We naively remember our homes and our friends and our families as being better in the past, but then some progress is made and we see them lash out violently to protect institutional racism. And we act surprised, thinking this isn't the Maycomb or the Atticus or the America that we remember. But the hate was always there; it was just able to remain hidden from us because we weren't its victims, and because at the moment it was going more or less unchallenged.

And it's in us too. Our own nostalgia for the simple days when we didn't feel the responsibility to confront the bigotry only fuels it. We try to put the responsibility on older people, attributing reactionary attitudes to generational change. But that's a naive comfort. The most haunting moment in Go Set a Watchman comes towards the end, when Jean Louise confronts her father, and he presses her to tell him what her first reaction to the Brown v. Board of Education decision was, and she admits, "I was furious." Atticus's racism is sad, but Scout's racism is a tragedy.