Listening to the "Brompton's Jingle," a song by the apparently obscure and possibly defunct band "Brompton's Cocktail." A sure sign that I've made a hilarious mistake.
Well, I've made a few of those this weekend, but the matter at hand is my latest essay for Nonfiction, a heaping helping of postmodernism I cobbled together when writing a straight narrative proved too difficult.
(The usual problem with my writing: I can describe people, but for some reason I can't get them to move. One of my classmates suggested that I use more action verbs; presumably she was being condescending.)
My guess is that Prof. McNulty will use it for one of his object lessons. I wrote about Manney, and my essay probably exemplifies in its failure something he's been trying to teach us all term: put yourself in the background. The very fact that I can't tell how good this narrative is tells me that I don't have the proper distance.
But this weekend wasn't all essays and broken keyboards.
I went to the MAPH Halloween party on Saturday, in a somewhat disappointing "Gray Lady" costume. I had been hoping to break my streak of spending costume parties explaining what I am (my costumes have ranged in esotericness from D.B. Cooper to the more obvious Little Boy) but my unwillingness to wear a skirt or dress, a lack of costume funds, low lighting that made my elaborate makeup invisible, and a shocking number of people unaware of the nickname all combined to dash this hope.
It was a good time, nevertheless. My preceptor had told me that these Halloween parties tend to be full of clever, somewhat esoteric costumes, and I was not disappointed. People were dressed as the avian flu (with flu haikus), the plastic bag from American Beauty, and Hunter S. Thompson, and of course there was more than enough good-natured irreligiosity.
(I'm sorry if I find the very thought of the pope at a kegger hilarious. It's that whole smug atheist thing.)
I have class with both of these girls and they're also among the maybe ten people I end up talking to at social hour, so I think I can reasonably post a picture of them and their respective boyfriends. I'd also like the chance to point out that the injured guy actually needs to wear that cast, which somehow made his costume seem all the more clever.
No pictures of me, though; I'm too embarrassed.
All told I took forty or fifty pictures with my $300 defense mechanism, which felt creepy because I've only been here for a month or two and don't really know these people. Some of them seemed to agree — I got quite a few deer-in-headlights looks and suspicious glances — but most of them seemed to be fine with it, and I uploaded the pictures to flickr as an expression of goodwill.
If my experience at Lawrence is any guide, picture-taking will eventually come naturally if I keep at it, but I don't want to start taking so many pictures on a regular basis. Senior year at Lawrence I started leaving my camera behind for some parties, and if memory serves, I had a little more fun when I started interacting instead of observing.
Granted, I don't remember much about those parties without photos to help me, but it's a trade-off I needed to make every now and then.
Halloween, yes. I had a lot of short conversations, especially at the beginning of the night (I foolishly arrived only half an hour late, when the place was still deserted — an hour and a half seems to be "fashionably late" here), but I remember having an interesting discussion or two with the Strategist, the girl from my precept who I'm always mixing up with the Medievalist even though they look nothing alike.
I totally called the Strategist on trying to get this one guy in the program to be her gbf and we commiserated on the general failure of gaydar this term. I used to take such pride in mine, too, but I guess gay people in Appleton (e.g. Roy the Effeminate Heterosexual) were just easier to spot.
And I guess I talked to other people or something? I remember thinking it was pretty late and leaving, and realizing on the bike ride home that I must be drunk, though I couldn't remember being that way at the party.
Found two stale Oreos in my pocket the next morning. Good times.
full of rage. cut and pasting qwerty letters is slow.
new keyboard soon.
[Update:] Ok, I can only be that stupid for so long. I've set up hot keys as a stopgap, but while this is much faster, I can't exactly write essays like this, and that's exactly what's on my schedule this weekend. I'm thinking... wireless?
Or, god help us, I can write the old-fashioned way, on paper? Then again, I don't know any writer who's not a "process" person...
Right now in "Writing Biography" we're reading The Silent Woman, a sort of metabiography about the fight over the details of Sylvia Plath's life, especially her final year.
Near the end of the class period we got into a prolonged argument about who has the right to those details. I was of the admittedly radical opinion that any public expression is itself a form of autobiography, that what we write is a function of how we want to be perceived (is it obvious I was thinking about bloggers?) and that, therefore, biographers have the right to offer a counternarrative to our self-constructed narratives, which are often little more than hagiography.
This view was not shared by many in the class. A more rational version of this, defending the public's interest in useful information about voluntarily public figures, seemed to be the consensus. A few people, on the other hand, were concerned about the effect of this prying "public interest" on those whose fame is involuntary, like the Bush twins for example, and thought it was unfair that Ted Hughes had to spend the rest of his life attempting to defend his image because he'd cheated on Sylvia Plath.
The argument was not resolved, as far as I can tell.
What the people defending Ted Hughes didn't seem to get was that to a certain extent we determine what we hear. He didn't have to read the Plath biographies, or write to newspapers about her, or become the executor of her estate. He wanted to control the message, and that meant participating in a conversation he didn't like.
Six months ago my friend Manney committed suicide, and now someone has put his name on an online suicide wall. I understand that I can't control what's being said about him, but like Hughes (who famously burnt one of Plath's journals) I'd like to try and shape the conversation.
The trouble is that I've had quite a bit of difficulty talking about this. Our Bold Hero can jabber on all he likes, but I'm a very private person. I've always felt that the fewer people I talk to about something, the more important it must be to me.
For most of the first week I didn't even want to mention Manney's death to the Lawrentians, and when, knowing they would find out eventually anyways, I decided to say something, I wrote the simplest, most basic thing I could.
I doubted my abilities as a writer; I couldn't be sure that anything longer wouldn't just obscure what I thought was important.
Part of it, too, was that I didn't feel I could speak about Manney in the same way that some people could. I was one of the good-time people, only around Manney in group situations for the most part. I respected Manney, put a lot of weight on his opinions, and as strange as it sounds, that made him intimidating to hang out with one-on-one, as I sometimes did during that last summer we were in B-town together. And so I didn't know him like some of the Hamlinites did.
Graham lived with Manney; they were hetero-life-buddies. As I've said before, his "Eulogy for Manney Anderson" says more about Manney than I ever could.
Through some fluke of blogspot, some of the posts at Manney's old blog are also online. He had an amazing writing voice; I was always wishing he'd blog more.
And I have my own stories about Manney, most of which you can't have.
Went to Wicker Park on Friday with the Medievalist, her boyfriend, a few of their friends, and another guy from the program.
(I'm not made of apt nicknames, people. Some people in my life must languish in internet anonymity longer than others.)
After hearing Celine, or maybe it was Jinx, go on about all the hipsters there, I was a bit disappointed. It's possible that I just can't spot hipsters, that, incredibly, what was hipsterish three years ago when I was paying more attention is no longer so and I was elbow deep in hipsters all night — but I'll trust my instincts.
I'd like to see it in the daytime before passing final judgment, but the neighborhood seemed cool and relatively unpretentious. The first bar we went to, the Map Room, had a dozen imported beers on tap and a suspiciously inaccurate painting of Africa on the wall. It shook my faith in tectonic plate theory. There was ample cigarette smoke and that nice background buzz from other conversations.
A good bar. It did me in though; I guess the beers from social hour hadn't yet worn off. Or I'm just a hopeless lightweight. Or a wannabe beer snob: You don't usually find Spaten Oktoberfest on tap. One of the Medievalist's friends had studied in Freiburg too but never had a colaweizen.
Everywhere we went seemed to have those tiny little bar tables with the wobbly chairs.
We had some Mexican food and I remember that the main amusement at the final bar was sticking my finger in lukewarm wax. I wasn't the only one, to be fair; it seemed really interesting at the time. That place had crazy music: it was the first time I'd heard my once-beloved Eels at a bar.
There were plans made to hang out some other time outside of social hour, which sounds good. But I was tapped as far as this weekend went.
I spent Saturday and some of today researching for my master's thesis. Anyone interested in blogging and narrative can stay up to date on that at my research blog, which is doubling as my research notebook. The rest of you I won't trouble with it.
There's always a worst day. This term I'm still trying to decide between Wednesday and Thursday.
I invariably stay up too late on Wednesday night doing something stupid — like trying to finish a few five-part Hellblazer plotlines or staging my own Law and Order marathon — and end up exhausted on Thursday. I hate getting up in the triple digits of the morning. Thursday mornings might suck.
I delivered the Chronicle on five hours of sleep this morning, which might explain why an entire bundle of papers disappeared, though it still doesn't explain how. Delivering the papers was supposed to take an hour; it takes me a little over an hour and a half, but the money's very good. I got my last month's paycheck today and I feel like a millionaire.
The evenings are the worst part of Wednesdays. I'm at school from four until nine, which means a late dinner. And this Wednesday in creative nonfiction we read both of my recent attempts.
Needless to say, since I posted my bit about The Simpsons here last week, I'd already had some idea what I was going to hear. Noble Joshua suggested that I write something about the difference between an essay and an article, because for me that seemed to be the problem. I'd written essays, I thought.
I shouldn't have made such a foolish distinction, of course. I guess with all the academic writing I'd been doing, before my two writing workshops kicked it into high gear, I was fixated on the academic essay. One girl — I'm the only guy in that class, for whatever reason — wrote a stunning little personal essay about her mom cooking latkes.
You might see my essay on "my favorite meal" here someday, if I get around to revising it. Monday's assignment is "the best present I've received" and I'm going to try to make that one shine. Last time, I didn't even think to write like the personal essays I enjoy reading.
The tail end of class was odd. We'd talked about one story by everyone in the class, and there's not enough time left to discuss more than a few stories, but the prof turned to my essay on the Simpsons to teach another object lesson. People were afraid to criticize anything, which is pretty typical of these workshops in my experience, but there were a lot of good points.
Then my prof, who's also an editor for the Chicago Tribune, points out an "error" in the following sentence:
While fans now complain about the show's reliance on celebrities — as of 2003, over 340 guest voices had appeared on the show, a Guinness world record — few object to the seamlessly integrated cameos of these early years.
Granted, this could be improved, but do you see the grammatical mistake? It just jumps out at you, doesn't it?
It shouldn't be "over 340 guest voices," he says. It should be "more than 340 guest voices." The preposition "over" expresses a spatial relationship.
Flabbergasted. One beat.
"I ate over 300 sandwiches..." I'm thinking out loud.
Perhaps it was the influence of so many Language Log posts on prescriptivism, but when this use of "over" continued to sound natural I didn't doubt myself. I told this veteran journalist flat out that his rule sounded like a pretty arbitrary distinction to me and I wondered aloud if he hadn't "been smoking a bit too much Elements of Style."
Judging by the collective sound of indrawn breath, that was a bold statement to make.
To his credit, Prof. McNulty (who does indeed recommend that we all buy Elements of Style) took my criticism gracefully and suggested I look this rule up, do some research on the topic. I didn't really need that kind of encouragement, but when I got home I found just what you'd expect: widespread use of this sense of "over" in newspapers and on the Internet at large, and a bunch of webpages full of prescriptivist nonsense, singling this out as an error.
"Over" has many standard uses, including "more than in degree, quantity, or extent," a sense that goes back at least a thousand years according to some sources. But prescriptivism is a bit of a religion, and for some reason a few of the prophets have decided this sense of "over" is an error. (Strunk and White themselves are silent on the matter, for the record.)
It really bugs me that, if I want to get a job copy-editing at some point, I'll probably have to play by these arbitrary rules so that people much like this simpleton don't think I'm stupid. I'm already regretting the pointed use of "which" instead of "that" in one application essay.
They amuse me, to be honest. For instance I love your use of "gammatical."
There's a difference between casual conversations (IM, email, etc.) between friends and work meant for public consumption. Only uncorrected mistakes in the latter frustrate me: I'm not wound up that tight.
As a society, we're writing more. The internet provides a soapbox for our beliefs and we use it. Grammar and spelling in this medium, though useful, aren't necessary for someone to participate in the conversation. If more people are writing and expressing their opinion, more power to them.
It's the kind of article that makes me want to start copy-editing again. I guess eight copy editors aren't quite enough manpower to, say, spell Tracey Ullman correctly, or spot a blatant malapropism like "into territory once chartered" in the freaking lede.
I really can't remember why I didn't go to the Maroon's orientation meeting; presumably there was some sort of schedule conflict. Why I didn't go over to their offices later is more complicated: I haven't found any of the articles that interesting, I assumed that they were already stuffed to the gills with copy-editors, and I wasn't sure I'd have enough time to help out between school and work. Probably other reasons too.
I should really be copy-editing something, though. It's such a good feeling, spotting and correcting errors, and so frustrating, as a reader, knowing there's a mistake I could've fixed. Worth some investigation.
The article itself, I should mention in case anyone reads it, lumps together the later seasons and makes the blanket statement that the show is in continuing decline. There was so little description of the later seasons, however, that I wonder if the author has even been watching. Needless to say I disagree with the assessment that since season 10 or so the show has gotten progressively worse.
Since I seem to have expended all my blogging energy, and no small amount of studying energy, composing my latest comment on the relative uselessness of theater, this entry will be short.
I made cookies. Imperial cookies to be exact; they're like vanilla wafers except instead of vanilla you use nutmeg. Here's a close-up of the final product:
I got the recipe from this page at Bartleby.com but only realized how annoying said recipe was once I'd already mixed the batter.
Here's a sane version:
Imperial Cookies
Mix until creamy, slowly adding in the sugar:
1/2 cup (1 stick) butter
1 cup sugar

Add to the above mixture:
2 eggs, already beaten
1 tablespoon milk
1 teaspoon lemon juice

Sift and combine with the other ingredients:
2 1/2 cups flour
2 teaspoons baking powder
1/2 teaspoon nutmeg
Either roll little balls of batter and press them flat between your hands, or if you're really fancy, use a rolling pin and a lightly floured cutting board. Bake at "moderate" heat, which for those of us with temperature settings on our ovens appears to be around 350-375 degrees Fahrenheit, for about 12 to 15 minutes (in the old days they just guessed). Makes a little over two dozen.
No rush, really: they don't seem to burn easily, they just get a little harder if they're in the oven too long, and it's hard to mess up a cookie so loaded with sugar and butter. Naturally this was my dinner, served fresh and brilliantly paired with a bologna sandwich on wheat bread with whip à la miracle.
So I attended the fourth installment of the MAPH Roundtable Reading Series this Friday, after skipping the last one because the play they were reading was written by David Mamet, a.k.a. the director of Spartan. This was only the second time I've gone: I also skipped the first week after deciding that the play (something about a baby that may or may not exist) sounded excessively postmodern.
Yes I know I'm out of touch. So most of the plays I like were written over 300 years ago. Is that a crime?
Not that I'm a very convincing partisan for Restoration Comedy when I can't remember any of the playwrights. That was a bit embarrassing. Next time I'll just be that guy who chatters on about Shakespeare.
As usual, I wasn't planning to go this week, but by the end of social hour I'd convinced myself. This week's play — David Auburn's "Skyscraper" — sounded decent enough, the host was providing drinks and appetizers, and I had nothing better to do. Their dubious taste aside, students who sit around drinking wine and reading a play on a Friday night sound like my kind of people.
The play was pretty good — much better than Bruce Norris' "The Infidel," which we read last time I went — but ultimately I just can't bring myself to care. I enjoy the evening's literary theme, the intellectual bent of the conversation (just like watching a movie at Adam's: there's always a discussion afterwards), and the free snacks don't hurt. There were even Cheez-Its.
I'd just rather spend an evening watching a movie than playing at high culture. You'd have to be a real snob these days not to admit that movies can do everything plays can do, and they often do it better.
In fact, since Hollywood (and to a lesser extent television) has so consistently co-opted willing dramatists (Auburn's Proof is in theaters now), I submit to you that Joseph Epstein's thoughts on poets apply equally well to modern playwrights:
I happen to think that we haven't had a major poet writing in English since perhaps the death of W.H. Auden or, to lower the bar a little, Philip Larkin. But new names are put forth nevertheless — high among them in recent years has been that of Seamus Heaney — because, after all, what kind of a time could we be living in if we didn't have a major poet? And besides there are all those prizes that, year after year, must be given out, even if so many of the recipients don't seem quite worthy of them.
This also applies (especially) to retired playwright Harold Pinter, who I'll dismiss without having seen any of his work. If his talents as a playwright don't transfer over to screenwriting then he's of no use to me.
Of course, I'm not likely to stop going to the roundtable sessions, which after all are my only regular interaction with other students outside of Jimmy's and class — I just wish the subject matter was a bit more to my liking. With the time allotted, modern short one- or two-act plays are the best objects for our attention, but they're also the least enjoyable part of the whole evening.
Ok. I'm a lurker at Graham's site, and have been pretty consistently impressed by the quality of your engagement with him (and others), Dan.
But this was dispiriting.
Pinter made quite a living--and quite a few damn good films--as a screenwriter. And even the best of these (say, "The Comfort of Strangers") pales in comparison to his astonishing and challenging work on stage.
The play about the baby -- was it "The Play About the Baby"? Are you just dismissing Edward Albee out of hand, too?
Here's something: I love film. But "Angels in America," while an outstanding bit of filmmaking, cannot, just can NOT do what Kushner's play did. It's not snobbery; it's attention to how texts work. There is a distinction in genres that plays out substantively, thematically, and (yes) entertainingly -- theater has real bodies in real time, so when there are scenes occurring simultaneously on stage, there is a tension (and collaboration) between actions that film (with its cross-cutting) cannot accomplish. When an angel descends over a body on stage, we see the wires, while film makes it "real." There's something about theatricality that great playwrights explore.
(That said, no play can ever get the dizzying scale of "Lawrence of Arabia," the full-blown visual pyrotechnics of "Lola Rennt". It's not either/or--or collapsing into sameness. They're different media; what can each medium do particularly well?)
I obviously care more about this than about rehashing debates about conservative vs. liberal. Sorry to jump right in, unannounced and uninvited, but... I mean it's Albee! He's fantastically funny and invigorating....
Besides being the obvious "The Play about the Baby," there's also "Buried Child", "Agnes of God" and "Who's Afraid of Virginia Woolf?" Relatively common thing, plays about real or imagined babies.
This is the problem theater has: seeing or reading a play has come to be identified as something high-class and snobbish, and most people would just rather see a movie because they're cheaper. Besides, no one goes to see a play just to see a play; there's an inherent intellectualism behind it. "I'm going to see this play, I'm going to LEARN something, I'm going to be intelligent." Films will be on this same level when graduate students start meeting every Friday night to read screenplays and drink wine.
"Angels" is a good example. See the play sometime then watch the TELEVISION EVENT. Decide for yourself, then.
I'm willing to concede that I was deliberately trying to provoke comment by dismissing Albee and Pinter so casually. While I've read my fair share of plays, and seen a half dozen or so on stage, I haven't seen/read anything by either of them. So that was unfair.
But lest I gain a reputation for backing down, I've heard this theatricality argument before and I've never been fond of it.
Cross-cutting is an old technique but not a necessary one: the fact that it's such a popular way of showing simultaneous action, even in independent films, might be evidence that film audiences find something in it with more utility than the tension/collaboration they could get from stage-style action shown in a wide shot.
Your second and, in my eyes, more important point about theatre seems to be a variation on Brecht's Verfremdungseffekt or (going further back) the "distance" that many Romantics thought necessary for artistic appreciation. There's no doubt in my mind that theatre these days doesn't try to make events "real" in the same way as most movies (though long before movies they might have striven for verisimilitude).
But it seems to me that many movies do the same thing: don't the storybook narration of Rushmore, the postmodern fluidity of Adaptation, and even the self-conscious product-placement of Josie and the Pussycats function in the same way as the wires on your angel, drawing us out of the illusion and reminding us that this is art?
First, hi Josh--I missed out on catching up with you before Japan--sorry. But I hope it's going gangbusters. Be careful with the blowfish.
Second, Dan-- I'll grant you that films (like "Rushmore" and "Adaptation") can estrange the viewer from simple immediacy; I'll go further and grant you that "Josie et al.," or "Jurassic Park" (a pop version of product-placement sell-your-cake-and-mock-it-too I'm more familiar with) rely on viewers trained in (or suspect of) the tricks of immediacy. Fictions & prose play this game, too, from Tristram Shandy to Colson Whitehead.
Still, even certain shared meta- or artifice/formal games will work differently in Laurence Sterne's & Vonnegut's novels, Kaufman/Jonze's & Anderson's films, and Brecht's & Kushner's plays. I'm all for playing against genre, of seeing genre as something which audiences and critics deploy (rather than something 'inside' the work, controlling how audiences might read). But I still think there's some utility in trying to name, describe, and evaluate how the different technologies for narration from page to stage to screen do have bearing on how we read.
Josh makes a good suggestion--read and view "Angels." On the other hand, your point--how films might appropriate the tricks of the trade from theater, and vice versa--is worth examining further, too. I don't think you have Lars Von Trier without Brecht, and maybe the former's work supplants--for current impact--the theatrical precedents. (But even as I say this, I think: are theater and film at war? Why either/or? Why not enjoy the enhanced, expanded pleasurable possibilities of two distinct, divergent media?)
And I still would push back on your pushing back about the generic/media distinctions between cross-cutting or simultaneity on stage, and the same on film or on the page. When two disparate sets of actions are occurring on stage, they comment on each other in strange useful ways because they're 'framed' together--they're both physically on stage. Yes, our eyes may move back and forth, trying to decide which action to focus/center on--and in that way films work just the same. But the bodies are there, there is a physical and spatial disjunction-yet-relation which film can't capture. In film, the relatively rare device of the splitscreen, aside from Mike Figgis' attempts to play with this formal device (hmmm... which does seem a challenge to my claim), tends to collapse the 'separate' spheres so that it's just one scene, really. (Most versions I can think of are phone conversations, right? The separate spheres are in explicit dialogue, and so spatial distance is undercut.) Or, when cross-cutting back and forth, film maintains 'separate' spheres--"Angels" illustrates this precisely in a scene where one member of a relationship sneaks out for an illicit rendezvous with an anonymous guy in a park, while the other member of the relationship is on a hospital bed being examined for symptoms of AIDS by his nurse. On stage, the simultaneity of bodies, especially the ordeals each set of bodies is undergoing, is very, very different than the film, where we go to the different locations for each shot.
And, before I head back to grade a paper and bathe my kid (or vice versa), some other 'avant-garde' contemporary playwrights worth a look-see: Suzan-Lori Parks, David Henry Hwang, Kushner & Kushner & Kushner again, Tom Stoppard, Caryl Churchill, Martin McDonagh, and, yeah, Albee & Pinter.
It's going great Mike, thanks. I'm sorry too; wanted to meet up with you, but I only had about 3 weeks to prepare and I ended up running around all over the place.
Names names names! This reminds me of the time I bet a friend of mine that he couldn't name five contemporary poets.
Literati points for referencing Laurence Sterne and Colson Whitehead in the same sentence. I'll admit I'd never heard of Whitehead — a bit of a shock since I try to keep up to date on the up-and-comers.
Noble Joshua's suggestion of "Angels" sounds right up my alley (I'm working my way through back issues of Hellblazer at the moment) and I'll check out the TV event if I can find it. That said, I still disagree with his wry point about reading screenplays on Fridays.
We don't need to read screenplays when we can "just" watch the movie, which is a key advantage film has over theatre. And while a lot more people are involved between the original script and the finished movie, if you've got enough theory/liquor in you and a good highbrow/middlebrow movie to chew on, there's still plenty of room for different interpretations.
Returning to your last comment, reynolds: I agree that the presence of real people all on the stage together, etc., does give theater a different feel, but I think film can approximate those experiences for the audience (that it often doesn't is usually incidental) in all but a few cases. There's value in what plays do that films simply cannot, of course, and in their much cheaper production values, but given the chance I'd still wave my magic wand and turn every good play into a good movie, because I think the tradeoff would be worth it.
You say that film and theatre are different media, and ask what each can do well, but for me, theatre is more like the ancestor from which film evolved. Fetishizing the differences between the two mediums is probably the only way to justify theatre's continued survival (no bones about its popularity) — and people can like theater; I've got few problems with people who have different tastes, but this seems to go beyond that.
So many people, even people who hate theater, seem to think theater is culturally "higher" than movies, and I'm sitting here wondering, "c'mon, really, what have you done for me lately?" Maybe once I run through some of these names I'll change my tune.
Creative nonfiction assignment one: write about a hobby, passion, or sport. It's not that I don't have any hobbies or sports, mind you (though I don't have that many). I just thought that writing about The Simpsons in a relatively detached manner sounded both fun and interesting.
Like the show that spawned it, the Simpsons article has become an institution. Each year, mainstream news outlets seeking a light story turn their attention to the show, acknowledging the latest milestone, award, or clever adaptation of the series. Writers wax nostalgic about their favorite episodes, or drop names from the show's impressive list of guest stars. Inevitably, they mention the show's humble beginnings as a series of 30-second clips on The Tracey Ullman Show in 1987.
Meanwhile, in student newspapers and on the Internet, devoted fans of the show are expressing their discontent. "Unfortunately, in recent seasons it has become increasingly apparent that the show has passed its creative peak," observes the McMaster University Silhouette. "Some fans have gone so far as to suggest that The Simpsons has been reaching for plot ideas and clever references for years."
Writing for the Drexel University Triangle, Ian Pugh echoes that sentiment: "'But it could get better,' you say. Well, unfortunately, we've been saying that for four years now, which is a practical eternity when it comes to television, and a second coming seems unlikely." Commentators at the popular newsgroup alt.tv.simpsons deride the disjointed plots and frequent guest stars in later seasons, and although there's no clear consensus on where the trouble started, it's clear that — rosy media profiles to the contrary — not all Simpsons episodes are created equal.
The first three seasons are slow and psychological — the creators were trying to make something more than a cartoon: this would be an "animated sitcom," with a more realistic family than most live-action shows. Homer is a dumb but goodhearted father, and the problems he and his family face are often quite serious: characters are looking out for each other's welfare, or wrestling with the morality of their own decisions. Some episodes make significant emotional demands on the audience. Bart cries after failing an exam, and dooming himself to another year of the fourth grade, in "Bart Gets an F." His sister Lisa has to decide between her father and her dream in "Lisa's Pony." Viewers generally come away from an episode feeling that they've gotten to know the Simpsons a little better.
There's comedy too, of course. No Simpsons season is without its laughs, and some of the all-time most popular episodes are in season three. The final scene of "Bart the Daredevil," in which Homer tries (and fails) to jump the Springfield Gorge on a skateboard, became a clip-show staple in later seasons. "Flaming Moe's" was the first of many episodes centered around the loveless bartender, and like "Homer at the Bat," another episode common on best-of lists, it featured numerous celebrity voices. While fans now complain about the show's reliance on celebrities — as of 2003, over 340 guest voices had appeared on the show, a Guinness world record — few object to the seamlessly integrated cameos of these early years. Kelsey Grammer's voicework as a disgruntled clown in "Krusty Gets Busted" was good enough to earn him a recurring role on the show in what would soon become a Simpsons institution: the Sideshow Bob episode. Another idea from season two, the Halloween special, was so popular it became an annual tradition.
A few standout episodes aside, however, seasons four through eight are the golden years of the series. Webmaster Jouni Paakkinen at snpp.com, the most comprehensive Simpsons site on the web, considers season four's "Last Exit to Springfield" the best episode of all time, as does Entertainment Weekly. In a 1998 poll, the fans on alt.tv.simpsons decided that "Lisa's Wedding," a season six episode set 15 years in the future, was their top pick. Fondness for this period isn't limited to the few compulsive watchers, either: even if they don't know the actual name, many fans speak highly of the "stonecutters episode" or the "helper-monkey episode." Most of them could sing you the "Mr. Plow" jingle, on key.
Figuring out why these seasons are so fun to watch is difficult: it's tempting to focus on the outlandish plots (Homer training to become an astronaut, or gaining weight to get on disability; Bart and his friends taking an ill-thought-out road trip to the World's Fair in Knoxville) and ignore the differences in the way these episodes function. The pacing is faster, for one thing, and characters are more likely to communicate with non-sequiturs and blank stares. Pop-culture references abound. The show also honed its satiric edge during this period — in "Two Bad Neighbors," former President George H.W. Bush moves in across the street; in "Homer Bad Man" a media circus ensues after Homer is wrongfully accused of sexually harassing the babysitter. The target of "Homer's Phobia" is obvious enough from the title.
While most viewers claim the show never "jumped the shark," many fans place the start of the show's decline somewhere between seasons eight and thirteen. In 2000, halfway through season eleven, recurring character Maude Flanders was killed by a T-shirt. In season thirteen, Homer literally jumped a shark during the opening credits. But for the hardcore fans, the kind of people who'll continue to watch the show even as they complain that it's gone downhill, the problem is "Jerkass Homer" — manic, idiotic, and thoughtless, Homer has become a caricature of his original sympathetic self.
At this point in the series, now entering its seventeenth season, the show is often self-satirizing. Even criticism gets incorporated into the show: feigning incompetence, Homer calls himself "Jerkass Homer," and a nerd character sports a T-shirt reading "Worst Episode Ever." Reduced to vehicles for quips and satire, the various characters seem less real. Still, the show has its charms.
The satire can surprise you. "Bart-Mangled Banner" depicts an America afraid of political dissent, and the last season had send-ups of China, American Idol, and Catholicism. There are moments of clever wordplay, like "Italian-American Mexican standoff." Though a lot has changed over the years, ultimately people tune in each Sunday because the show is — incredibly — still funny.
Don't trust my gloss of "Jerkass Homer." I didn't invent the phrase, nor do I use it. I'd also like to add, because I didn't get room in the essay, that I think there's a split in the later seasons. There are several great episodes in fifteen and sixteen (though so far seventeen has been a dud), enough that I'm tempted to say the series is back on its feet. Anyone who doesn't believe me should watch "Goo Goo Gai Pan" or (if you're a lapsed Catholic) "The Father, the Son, and the Holy Guest Star." For what it's worth, I thought they were the funniest in years.
It's been fun, but studied indifference to my audience is ultimately antithetical to what I think blogging is all about.
I don't expect to get many comments, but I feel I should at least give people the opportunity to respond to my posts. That said, comments can be deleted (I'll work out some code of ethics there eventually) and some entries won't be open to discussion.
Is this really a creative piece of non-fiction? I get the feeling you wrote more of an essay than anything else. I liked reading it; reading your writing makes me feel more intelligent, Dan, and you make some good points here about The Simpsons, but it's nothing I haven't heard or read before from anyone who watches with any frequency.
My real question is, how do The Simpsons apply to you? Your decision to write in a "detached manner" isn't really that fun from the reader's perspective. It makes for an interesting read, but I can apply the word interesting to almost anything; it's a neutral word that's lost most of its meaning.
That's my main worry with this class. I'm taking it primarily to become a better blogger, and yet the prof keeps telling us to "put ourselves in the background," which doesn't fly in the blogosphere.
This is supposed to be an "article" but you're calling it an "essay." My suspicions are confirmed: I've gone too far. God, why did I listen to someone so enamoured with Elements of Style?
P.S. You need a weblog, Noble Joshua. I read all the available issues of Fable a few weeks ago but had no way to discuss them with you.
Write a post on the difference between an article and essay so I don't make the same mistake twice.
That was my thought: those creative writing profs are always pushing at what could possibly be the autobiographical. Given the longevity of The Simpsons, the simplest route would have been tying specific episodes into your life, perhaps saying that seasons 3-6 were the golden years of elementary/junior high school, while the current seasons, their occasional high point notwithstanding, are a bit mediocre and ho-hum, much like "real life". Whether or not either statement is true is up to you. You're being creative, after all.
Maybe your prof will be pleased. But I think the point of the assignment is about what the hobby/interest says about you and not just the hobby/interest in and of itself.
I'm in the process of setting up a webpage. I'm in Japan right now and it's almost a required thing to have a blog when you're abroad. And as for Fable... have you started the new Arabian Nights storyline yet?
Our semiweekly MAPH lectures have their own subtle politics, as I'm coming to learn. There are about 100 of us packed into a lecture hall, and though only four or five of us end up asking Profs Jay and Candace (sometimes referred to, à la Bennifer, simply as "Jandace") any questions, our individual seating preferences express plenty.
I usually sit in the middle-left section as you face the lectern. Like most of the left section, the middle-left section is populated by the late, the half-interested. The people who, on a hot day, don't care who they sit by as long as they're near the only set of windows in the room.
The back of the room is even less interested — or rather, more interested in seeming less interested. The preceptors sit back there for propriety's sake, but I can't make too many generalizations about the actual students back there. I'm still trying to figure out if there's a difference between the back and the greater middle-right section. I'm also toying with the theory that, except for the back and front sections, the only other major split is left-right, with minor pockets of variation.
The front is exactly what you'd expect: the quietly studious. They're too close to ask Jandace any questions, because they're right there. Only the people on the very ends of the front few rows seem sufficiently removed to gather up their courage.
Today I sat in the lesser middle-right section, which is ideologically aligned with the front section. The area where I was seemed lousy with writers and other would-be literati, etc. In short: most of the people I like in the program. It's also much more vocal than the front; I'd say that the majority of questions come from the lesser middle-right, with the more relaxed denizens of the greater middle-right picking up the slack.
There is something psychological about sitting in these different areas, I think. With everyone around me furiously taking notes, I felt like the laziest guy in the class. Whereas back in the middle-left I was usually taking more notes than anyone around me. Everyone was following along in their reading, too.
We're still studying Marx, and Candace opined that just because we can imagine something doesn't mean we can theorize about it. As in: we can anticipate communism but we can't make any theories about how it would actually work? Well, something like that. Candace's example was unicorns; presumably we can theorize about horses though.
It all sounded like nonsense. Someone made the astute point that there's a lot of theoretical work basing itself on imagined worlds like the Matrix. And as far as I can tell, plenty of people have been using Serenity the same way: after I see that movie with Jinx(!) tonight(!), I can finally read articles like this one.
Now, my friend Graham and I have been arguing ever since grade school — in fact our ongoing "rock vs. magnet" debate, predicated on an object that neither of us has seen for a decade, is the stuff of legend — so disagreement is nothing new. We're both rational people and there's a lot of common ground that goes unmentioned. Though we rarely seem to convince each other, it's a rare argument that doesn't end amicably.
Lately — ever since he posted on "The importance of being partisan" — it seems like I've been disagreeing with Graham more than usual. It's quite refreshing, actually, to have so many good arguments on IM.
He plays the spirited partisan and I try to moderate. Just like old times.
Graham's latest post, provoked by some needling by a Hamlinite named Justin, demonstrates both his skills as a rhetorician and his increasing (and for a moderate like myself, worrying) idealism and liberalism. A good fisking is in order.
(I don't mean "liberal" in a pejorative sense, by the by. It may be a dirty word in the rest of the Midwest, but I'm Minnesotan. We had Wellstone.)
Graham's post begins with a trick I just learned about in Writing Biography: the block quote. Did you know that most people don't actually read block quotes? We tend to skim them and accept the writer's analysis of what the quote is saying, which is great for the writer. Justin's point in the first quote is that Hamline graduates are hypocrites, and his main point in the second quote is that the three factors he mentions (public school teaching, scholarship-funded travel, envy of more financially successful conservatives) make people remain liberal after college.
I'm not saying Justin is right, just that Graham ignores those contentions at this point in his post, focusing instead on attacks peripheral to the argument. While the criticism in Justin's post applies to Graham, it clearly applies to others as well: my friend makes it personal to get our sympathy and justify his response. So we find him defending the merits of his Fulbright and saying nothing about whether it has let him remain liberal.
The next paragraph contains a textbook example of what some rhetoricians call "warrant" — a fancy word for a buried assumption that makes a connection seem logical. Graham writes:
Well, lest we get into a flamewar between pretentiousblowhard.org and livejournal, I'd just like to point out some foolishness in his post regarding liberals in the academic world.
The axiomatic warrant here (if Graham's "lest" isn't merely ironic or disingenuous) is something like this:
If I point out foolishness in Justin's post, then we will not get into a flamewar.
That's the funny thing about warrants. Sometimes they're hidden because writing them out would destroy the argument they support. Nothing about Graham's post will prevent a flamewar. Quite to the contrary, I suspect.
On to the meat of Graham's argument. I'm beginning to understand why I've never done anything like this before.
1) Are college campuses liberal because they're separated from the real world? Or because higher education draws people who take "a more complex, measured view of reality," what amounts to a "liberals are just smarter" explanation? Graham supports the latter claim, and I doubt that many in his audience would disagree.
Personally I don't think Graham's politically diametric outlook is complex or measured, but college students in general do seem more receptive to a wider spectrum of ideas, left-shifted though it may be. You'd be hard-pressed to find a group of people who take thinkers like Marx, Freud, or Peter Singer as seriously as we do. A lot of that does have something to do with our separation from the real world: we're placed in an environment where it behooves us to be receptive to a wide variety of new ideas. In a sense both Graham and Justin are right.
Take a few steps back, however, and you'll see that Graham has created a false dichotomy. Aren't there other reasons why college students would be more liberal? Some credit needs to go to the generally liberal character of most college professors, a phenomenon Peter Levine analyzes here. Another explanation is attitude amplification among likeminded individuals, which could exaggerate college liberalism. I'm taking my very first sociology class right now, but I'm sure there are still more options.
(And before anyone pats himself on the back for choosing all the liberal/wisest positions on political issues, a study by Paul Goren at ASU, which I read about in a New York Times Magazine article last year, found that voters typically formed their party affiliations before they knew what issues their party stands for. Why some people switch is more complicated.)
1a) As part of his contention that liberals at college campuses take a more complex, measured view of reality, Graham posits a conservative approach to sociology. As usual at pb.org, Graham can count on the sympathy of most of his audience and abandons intellectual honesty in favor of a crude, sarcastic caricature of conservatism.
Apparently all conservatives think that "personal responsibility" is the only thing sociologists need to worry about. However, while many conservatives do believe that "poor people are poor because they're lazy and need to work harder," traditionally conservative fields of study like economics have been brought to bear on sociological issues, most recently in the best-selling book Freakonomics. One of my ongoing arguments with Graham recently has been over using revealed preference to explain human decision-making; like Carry Out at Lawrence he's uncomfortable with economic measurements that show us putting a price on things like rainforests, life, and dignity.
1b) Graham goes on to claim that some conservative students hold this simplistic "personal responsibility über alles" view, and they shouldn't complain when a professor contradicts them. Such students probably exist, and I really have no objections to this scenario. I cannot praise a fugitive and cloistered virtue, etc.
I suspect that Graham believes conservative students have more trouble growing up, learning, and rejecting black-and-white conceptions of society than liberal students, who either a) have grown up already (back to the "liberals are smarter" contention) or b) have an easier time accepting the teachings of the liberal professors who predominate in certain disciplines. Obviously Graham is generalizing without evidence, but I don't necessarily disagree with b). At least one study has shown that conservative students have it easier in conservative disciplines.
2) Graham then turns to Justin's argument that, except in the situations noted in the second block quote, graduates "either have to a) go back to suckle on the teat of malto-milky academia to remain truly liberal or b) sell out." Note that Graham doesn't mention the exceptions from the second block quote, choosing instead to oversimplify the issue. Even with the exceptions, I don't disagree with Graham, but the Justin we get here has a bit of straw, if you get my drift.
2a) In response to this claim of Justin's, Graham observes that, lacking a trust fund or comparable means of support, most "college students do feel the pressure of the pocket book" and yet "still feel 'charitable' and believe in a world that can be made better by progressive cooperative action."
Who's he talking about? I agree that many students feel economic pressures while in college, and many of my Hamlinite friends dealt with those pressures without giving in to The Man. But do you see the way he made all the struggling college students liberal?
I'll go tell my old paleo-conservative roommate to stop worrying about those student loans, right now. Presumably he's got a trust fund he doesn't know about.
Perhaps Graham didn't mean to generalize, didn't realize he was ignoring the struggling conservative (and moderate: I know at least one wishy-washy moderate who's tens of thousands of dollars in debt) students. Rhetorically, though, he's now in the excellent position of speaking for all the poor students, now presumably liberal.
2b) Now we're talking about the liberal students outside of college. There's some fine (if vague) rhetoric about many of the liberal students feeling that way until the day they die. Graham presents no figures about liberal attitudes over that time period, and I certainly have no such figures with which to refute him. But we both know that "many," like "arguably," is a journalistic weasel word that's very difficult to prove wrong. It certainly sounds nice.
Here's another passage with a nasty assumption:
And yes, many, till the day we die (even not having had sucked on academia's teat the whole time). Why is that? Because we aren't just looking at the bottom line.
The warrant?
If we don't just look at the bottom line, we are liberal.
This is an example of what I've described as Graham's increasing idealism: his equation of an economically-oriented view of the world with conservatism (you don't have to be an old-fashioned Marxist to think liberalism is compatible with economics) and his equation of that same economically-oriented view with selfishness (the Copenhagen Consensus is a good counterexample).
His juxtaposition of Justin's plastic fantastic rock 'n' roll lifestyle with that of the fair-minded liberals who "want a better world and a better country" — in Graham's black-and-white world there are few, if any conservatives who want such a thing — poses a false dilemma between utopian idealism and being a selfish jerk. Liberals stay liberal because they're better people.
2c) But some liberals end up apparently "selling out," and it's them Graham turns to now. These students have debts to pay off and saving the world doesn't tend to pay well.
i. He opines that students in traditionally liberal majors aren't obligated to "walk a narrow path of self-righteous charity work," which seems like a non sequitur. I think Graham means that they can work for money (in Corporate America?) without soiling themselves morally. I certainly agree with this, but I don't know if it's justified by anything else in the post. Does this counter Justin's claim (blockquote one) that liberals are hypocrites? Wasn't a problem with conservatives that they didn't feel charitable obligations?
ii. The next claim, that while they might like to walk that narrow path, they can't afford to, is more palatable. Here he's not talking about a lack of obligation, he's talking about feasibility. They're just doing what they need to to survive and get out of debt.
Graham points out that both these claims (the inexplicable one and the appeal to necessity) add a complexity to his argument that was lacking in the dichotomy he attributes to Justin. Again, remember that Graham made that dichotomy by ignoring some of the complexity in Justin's argument. Another straw man defeated.
3) The end of the post contains a few proposals of Graham's for getting rid of these financial burdens, and since whatever disagreements I have with those proposals have more to do with my libertarian leanings than with a fault in Graham's last-registered-Democrat logic, I'll skip them here.
The last paragraph, with its caricatures of both factions, is a rhetorical flourish, and a rather nicely done one, reminding people of the complexity and nuance of our nation's partisan divide.
We had a handout today on the "pseudo-iterative" in Writing Biography, which is indeed the geeky narratology/rhetoric course of my dreams. The last class period we'd learned about three forms of frequency in narrative descriptions: singular (a unique event), iterative (a recurring event), and pseudo-iterative. I couldn't fathom how the latter might work, but here's the example Prof. Weiner gave us, from Marcel Proust's Swann's Way:
Often the sun would disappear behind a cloud, which impinged on its roundness, but whose edge the sun gilded in return. The brightness, though not the light of day, would then be shut off from a landscape in which all life appeared to be suspended, while the little village of Roussainville carved in relief upon the sky the white mass of its gables, with a startling precision of detail. A gust of wind blew from its perch a rook, which floated away and settled in the distance, while beneath a paling sky the woods on the horizon assumed a deeper tone of blue, as though they were painted in one of those cameos which you still find decorating the walls of old houses.
Now that's just weird, and speaking as a writer (cough) it looks very hard to do. I'll add Proust to the list of authors I'd read if they'd written something shorter. Like less than a thousand pages. Sorry, I can't make that big of a commitment right now.
This is one of those courses where I have to restrain myself from talking too much. Still working on that.
As our professor said the first day, "this is not a cool course." It's not cool in the sense that most English courses are uncool, but even within the department we're doing things we shouldn't be doing, old-fashioned things. We talk about rhetoric and its pragmatic effects. Sometimes we ignore postmodernism. Someone mentioned authorial intent.
Well, someday I'll do it, and know I'm doing it, and you won't even notice.
Today we talked about four kinds of narrative structure, and again there's a type (we're calling it "mystic," not to be confused with "prophetic" structure) that I don't understand. I'm told I should examine some East Asian fiction if I'm still confused.
I love these classification systems. I dream of a narratology CCG, though oddly enough, even in my dream I'm the only one willing to play it.
I confess: I can't conceal my excitement now that Bush has nominated Harriet Miers for Supreme Court justice.
I'm a little annoyed that my guy (or in this case, girl) wasn't his pick, so maybe mere schadenfreude explains my interest. Personally, I hope not; I tend to find schadenfreude in politics (especially from the Left, now that I mention it) incredibly irritating.
No, I don't think I have a monkey in this fight. I just love the drama of a good political struggle. And the irony that this, of all things, would be what unites the country.