Postmodernism’s dead, but elitism isn’t. Luckily, neither is culture.

I’m going to take a detour from the typical content of this blog, because today I found myself reading an article from Philosophy Now that so profoundly irritates me that I can’t resist detailing the reasons: “The Death of Postmodernism and Beyond,” by Exeter-educated Oxford dweller Dr. Alan Kirby. Kirby holds a PhD in English Literature, my erstwhile field of study, so I feel both entitled and compelled to disagree with him in a deeply catty way appropriate to the long history of cattiness and disagreement in our discipline.  (You should know, before starting out, that the correct way to read any quote from Dr. Kirby is to follow it with an indignant “harrumph!”)

After concluding that postmodernism and its “fetishism of the author” are, indeed, dead (or at least, so senescent as to be reduced to “a source of marginal gags in pop culture”), Kirby opens a section titled “What’s Post Postmodernism?” only to discard, without justification, that commonly used terminology and state that he will refer to contemporary culture as “pseudo-modernism.” He gives no reason for choosing this epithet, so I can only conclude that its primary purpose is derision– “pseudo” anything rarely sounds promising. He defines “pseudo-modern cultural products” as those that “cannot and do not exist unless the individual intervenes physically in them.”

It’s true that such “products” exist, in many forms, whatever we choose to label them. I’ll reject his mocking moniker and choose to coin the more descriptive “participatory textuality.” Mobile devices and broadband internet have, without question, opened up many new ways for culture to move from static text to exchange. However, Kirby errs hideously in going on to write, first, as though participatory textuality is the dominant, or even exclusive, mode of twenty-first century culture, and second, as though this mode is entirely novel.

Kirby reveals a strange preoccupation with, of all things, the “reality” TV show Big Brother, seeming to see it as some sort of apotheosis of contemporary media. I have never met any avid consumer of popular media who would acknowledge it as such, and it is in fact not at all representative of millennial textuality as a whole, or even of the “reality” genre. Therefore, I assume that Kirby’s insistence on its importance is due either to grasping at straws to support his curmudgeonly, stubbornly out-of-touch assessment, or to having been exposed only to bits and pieces of any twenty-first century “cultural products,” and consequently building a precarious argument on a few small points of reference.

Regardless, he makes much of the fact that unlike, say, Great Expectations (note that he chooses a “classic” text here, missing the opportunity to observe that many works of considerably less staying power, The Da Vinci Code for example, also have an independent material existence), Big Brother “would not exist materially if nobody phoned up to vote its contestants off.” This is simply silly. Should the telephones of the world fail and Big Brother therefore fall from glory, we would still be left with Cupcake Wars, Real Housewives, The Bachelor, and all the other “reality” shows that are much more representative and relevant. One might as well point out that newspaper advice columns would not exist if readers didn’t write in with their questions. While true, this would be a banal point that would say nothing of the nature of newspapers, publishing, or the twentieth century on the whole. Does such a narrow corner of a generation’s cultural consumption deserve to be anointed as the successor to as broad a movement as postmodernism? I don’t think so.

Connected to the format of Big Brother, Kirby asserts, is the entirety of the internet, whose every aspect, he claims, is “pseudo-modern.” He argues that “clicking on [one’s] mouse” (a peculiarly particular way of describing internet navigation) creates a “pathway through cultural products” that “cannot be duplicated,” implying that this is, in itself, a form of participatory textuality, in contrast to, say, the reading of a novel, which is meant to be done just one way. But the novel or monograph was hardly the exclusive textual form in previous decades. Newspapers, magazines, and collections of jokes, short stories, essays, and poems have all long been standard fare, and it’s difficult to believe that the typical way of consuming these texts has ever been by reading straight through from beginning to end, making the experience identical for each reader. Does it reduce the uniqueness or value of content for a reader to navigate it in a way that makes sense to them? I don’t think a feminist critique or a science journal article is diminished by its proximity to a Buzzfeed quiz, any more than a printed text is diminished because the reader gets up to do the dishes mid-chapter.  Realistically, this is how content is incorporated and embraced in our prosaic and individual lives, rather than remaining the domain of those for whom it is a profession.

Kirby also invokes the downloading and “mixing-and-matching” of individual songs as a distinct mode of consumption from “the idea of the album as a coherent work of art, a body of integrated meaning.” The buyers of “single” releases and of compilations like the “Now That’s What I Call Music” franchise, and the listeners who, when I was growing up twenty years ago, skipped through CDs to listen to their favorite tracks, would no doubt be disconsolate to hear that they were getting it all wrong by insufficiently appreciating the considered monographic nature of an album, and thereby nosing in on its very authorship. The demand that an album be considered only as a whole sounds like a complaint from your emo ex-boyfriend about the 3-hour “rock opera” he recorded in his basement; when professional musicians contest their authorship vis-à-vis intellectual property, it’s usually to demand compensation, not that you stop listening to their songs in the wrong order. They, quite rightly, don’t see this as much of a threat to their authorial claim. The demand for creative integrity could easily be taken to extremes that seem not much more absurd than it does here: imagine being told you could only rightly read Dubliners if you proceeded posthaste toward Finnegans Wake, without “mixing-and-matching” the intervening texts with, say, essays by Alan Kirby; otherwise, your boorish whims would be infringing on Joyce’s authorship.

The consumer’s tendency to “mix and match” online or downloaded content– not just music, but any media– is a far cry from negating its authorship and material existence. The vast majority of online content still takes the form of static pieces written by specific people. The author may not always be “fetishized” as anyone from Dickens to Derrida was, but then, neither, on the whole, are the authors of print newspaper and magazine pieces. Many content creators either online or in print– and often both– are still well-known and avidly followed by name, from Joel Osteen to Richard Dawkins. Kirby seems to think that content on the internet is in a wild and uncontrolled state of flux, in which any given URL provides individuals with different content from click to click, second to second. While it’s true that pages can be and are edited or removed, it’s equally true that the internet publicly memorializes content, in many ways, to a much greater extent than print media. Just ask anyone who has tried in vain to get compromising content removed. Content that, if printed, consumers would have to search long and hard for in the bowels of library archives is instead available years later at the click of a “search” button, and even if it’s been removed, its material existence very often continues in the form of quotes, screenshots, and cached versions.

Nor does the ubiquitous existence of comment threads do anything, as Kirby insinuates, to infringe on material existence or authorship. People comment upon a published piece; that doesn’t mean their comments are the piece. Discussion of a text as part of its enjoyment is hardly new. Comment threads are little different from discussions among friends, book clubs, or reviews, save that they enable individuals from diverse locales and backgrounds to share and contest their reactions– is that a change for the worse? Sure, some outlets publish content created out of user input– and how exactly does this differ from the inclusion of a “letters to the editor” section, or the solicitation of independent op-ed pieces? For that matter, what is a literature class but an opportunity for “recipients” of culture to become participants as they analyze, compare, write about, and argue over extant texts? Is Dickens any less the author of Great Expectations because generations of students have tirelessly done the academic equivalent of hitting “enter” on the comments section– often with a similar degree of insight and style?

Kirby further overreaches in his attempts to illustrate the “pseudo-modernity” of the internet. He mentions, in short order, mapping and navigation applications; Wikipedia; and blogging as evidence of the authorless evanescence of the medium. This is no different from trying to define twentieth century media in terms of family Christmas newsletters, the Encyclopedia Britannica, and the Rand McNally Road Atlas– an overview that would probably yield a similar assessment.

Satellite maps materially exist, of course– though in binary, not analog, form– just as much as the pages of the Road Atlas do; their use simply varies depending on the needs of the consumer, who uses text boxes and zoom tools the way someone thirty years ago might have used a highlighter and a magnifying glass. The only difference is that MapQuest does what Rand McNally tried to do– but better. It was always the objective of maps to provide a dynamic way for readers to navigate on their individual journeys. Now, we simply do so more efficiently, accurately, and practically. And for those who consider satellite navigation too “ephemeral,” never fear; Rand McNally is still in business.

Likewise, an encyclopedia was never intended to be a static treatise. It was always contributed to by a multitude of “experts” of greater and lesser degrees, mostly unacknowledged (or at least, few users noticed or cared who they were). New versions, appendices, and errata were continuously released because information, unlike printed texts, does not sit still. Again, all Wikipedia does is use technology to better fulfill the function encyclopedists always intended to serve. Edits are not arbitrary, continual, anonymous, unregulated, or impermanent. Rather, a team of dedicated editors vets them, locks down certain pages, reverses spurious changes, and flags areas that need improvement. The average Wikipedia user comes to the site not to arbitrarily insert their input and leave the entire encyclopedia in aimless flux, but to peruse a particular entry in search of information– just as they would with a print encyclopedia– except that now the information is up to date, more extensive, easier to find, and without the possibility of paper cuts. Of course, not everyone agrees about the value of Wikipedia as a factual resource, but the debate is one of reliability, not ephemerality or attribution.

As for blogs– Kirby calls the act of “making up pages” (emphasis his) “intrinsic to the internet” (while in the same sentence asserting that “pages are not ‘authored’”)– they do indeed offer a newly effective, cheap, fast, and potentially anonymous way to personally disseminate information. They are, in that sense, more democratic than print media. But this is a quantitative, not a qualitative, difference. Xerox machines and self-publishing presses are old news. Before them, there were mimeographs and carbon copy typing. Blake’s poems and their accompanying illustrations were self-published. And a thousand years ago, scribes were laboriously copying popular texts with little to no concern for accuracy or attribution, shamelessly amending, abbreviating, and anonymizing, while packaging their latest risqué riddles along with their versions of long-standing oral tradition. Unmanaged, more or less democratic popular contribution to the cultural narrative is not novel or worrisome. It is and always has been true that anyone can write, type, or record and share anything they like, and either put their name on it or not. There is now, no less than then, and for better or worse, a recognizable hierarchy of amateur and “respectable” sources, both on and offline.

Kirby also seems to be unaware that forms of entertainment that “foreground the activity of their ‘reception’” are far from a modern (or “pseudo-modern”) phenomenon. Dance music today is, yes, meant to be danced to– of course. So are the traditional dance tunes of Gaelic and other cultures, and so were Baroque rondos and minuets. As for his assessment of “industrial pornography,” if he thinks that older publications of erotic texts and images were meant to be “read or watched” but not “used,” he is not only ignorant but delusionally naive.

In fact, the pre-“pseudo-modern” culture that he seems to hark back to is not a long-standing reality, but a lingering conservative fantasy of the Western European upper classes that erases the importance of cultural expressions that preceded, disregarded, or were outside the scope of the Enlightenment, or which its enshrined thinkers deemed uncouth. This bias is neither modern nor postmodern, but simply elitist. It looks with disdain at the organically evolving, actively participatory, and often deeply subversive media that lie outside Kirby’s chosen pantheon. It’s tempting, in fact, to label such elitist overtones “pseudo-postmodernism.”

Kirby gives us a hamfisted clue to his true theoretical underpinnings when he derides the “puerile primitivism” of focus on “the acts which beget and which end life”– a painfully refined way of referring to sex and violence. Neither of these is a subject whose central importance is new to post-postmodernism. They are the blood and bones of nearly every cultural tradition, from children’s stories to aristocratic literature, with the uncommon exception of Victorian prudishness– and they are such because they are profound, intellectually and emotionally disruptive experiences that nearly all of us hold in common. It is absurd to imply that modernism and postmodernism, which trumpet such works as Ulysses (which also contains farts!), Lolita, and American Psycho, as well as a continuing fixation on Freudian sexuality, are somehow above such “puerile” concerns. Would he prefer that subject matter were confined to sexless, epistemologically troubled middle-aged white men engaging in literary criticism? Or perhaps storks delivering babies to erudite, hyper-conscious Oxford couples, who later die ironically but non-violently in their beds? It’s both amusing and troubling that, while maintaining such a stance, he still lays claim to feminism and postcolonialism as artifacts of his tradition, when neither could exist without centralizing frank discussions of the uncouth aspects of oppression. Angst may be polite, but suffering is not.

In short, none of the characteristics of the digital age have gone nearly so far as Kirby would like us to believe toward making authorship “irrelevant, unknown, sidelined” or content “characterised both by its hyper-ephemerality and by its instability.” The internet has, undoubtedly, increased the amount of content that he would probably call “vacuous” and “banal,” but it has also increased the equitable dissemination of all forms of media– including, ironically, his own essay, which I assume he would deem substantive. It’s of little importance that Big Brother doesn’t hold up well as a rerun, because millennial culture, and even participatory textuality itself, is not defined by call-in “reality” TV shows or transitory memes. Authorship and stewardship are in no danger when the vast majority of content is still scripted, attributed, and recorded. It’s also interesting that someone staunchly upholding the intellectual value of the postmodern should prove so technophobic, since early postmodernism avidly embraced the advent of media that were new, odd, and experimental, such as video and screen-printing.

Even should transience take hold to the extent Kirby fears it has, the situation will be far from as dire as he lets on. He opines, “A culture based on these things can have no memory – certainly not the burdensome sense of a preceding cultural inheritance which informed modernism and postmodernism.” This is, again, pure elitism: a romanticizing of his immediate intellectual tradition, an assumption that only the type of “cultural inheritance” that informed his studies, his experience and interpretation of the modern and postmodern, is of any substance. Every culture has a memory, but that memory need not be defined as rigidly as Kirby chooses, in terms of monographs lovingly dusted and episodes of Fawlty Towers faithfully recorded and rewatched. Looking back on history he chooses to forget, I challenge him to study Beowulf, a literary text that is in many ways a relic of an age of “evanescent” and “non-reproducible” oral storytelling, and conclude that there is no weighty (whether “burdensome” is harder to say) sense of inheritance in its style, tone, subject and allusions. Or to claim that an unfilmed production of a play by either Shakespeare or Tom Stoppard is “ephemeral” and therefore valueless. “Non-reproducible” does not by any means connote “without import or effect.” Participant-recipients, not material texts, are the measure of effective communication.

Kirby concludes that, at its core, “pseudo-modernism” differs from (and pales before) postmodernism because participatory textuality “defines the real implicitly as myself” and is “of course [emphasis mine this time] consumerist and conformist,” devoid of postmodernism’s pervasive irony and questioning. I believe his mistake lies largely in attempting to compare an oddly-gathered smattering of the most commercial popular culture of today with a selection of the most intellectually influential academic and literary works of the previous century. Postmodernism did not much inform I Dream of Jeannie or Cheers, and so theoretical post-postmodernism, and with it the exciting creative and intellectual possibilities of democratic participatory textuality, is unlikely to be perceived in an analysis based on Big Brother.

The tyranny of vying religious fanaticisms and protracted warfare has, as Kirby suggests, helped to define postmodernism’s successors, as have oppression, technological advance, and global capitalism, but they have done so in no more direct, top-down, or thought-blunting a fashion than imperialism, fascism, and nuclear armament informed twentieth century thought. Subversion cannot exist without something to subvert, and if something can be subverted, it will be. There is no paucity of irony, anxiety, and rebellious self-examination in millennial culture, but you will have to do better than a cursory, and arbitrary, glance to take it in. Post-postmodernism tends not toward abandonment of its predecessors, per se, but toward a reconciliation of modernist disillusionment and postmodern radical self-consciousness with the notion that sincerity, aesthetics, and personality can, applied judiciously, transcend both.

It’s not that all contemporary media take their own reality for granted; it’s that they posit a reality in the first place. Hipsters have made the postmodern a hokey fashion statement, but the backlash against hipsters is not simply philistine or reactionary; it’s a recognition that after a century of tearing down traditional lexicons of context and meaning, perhaps it’s time to develop new ones, so that the world becomes something that we can enjoy and actively shape, rather than merely play with and make virtual. I’ve often joked that class discussions on postmodern literature amount to this exchange: “I don’t like this book! It’s dull and pretentious and badly written!” “You’re not supposed to like it. It’s not supposed to be good. That’s the whole point!” At this point, such texts and attitudes are not surprising or inventive; they have become as trite and banal as Kirby accuses millennial texts of being. Pervasive irony is no longer ironic. Ubiquitous hyper-consciousness is no longer radical.

Is it so reprehensible to grow weary of textuality that places self-questioning above storytelling, and irony above empathy? Does desiring to feel immersed and emotionally connected with a text– or at least, in a way that involves some emotion besides existential angst– truly reduce to nothing but a “trance”? Does acknowledging irony and answering it with sincerity really signify being too empty-headed to “get it”? Perhaps what new creators seek is not to reflexively define their works as genuine, but to consciously make the world, and themselves, real; to not just comment on but counteract absurdity, crassness, and homogeneity. One does not have to return to the Dark Ages or Elizabethan England to assert that the resonance of heroism or tragedy has non-ironic value. One does not have to smash the printing press to see the possibilities in participatory storytelling and oral tradition.

I do agree with Kirby on one point: Postmodernism is dead. That’s neither to be cheered nor lamented, and certainly not debated. We spent decades productively coming to terms with the fallibility of hegemonic narratives and authorial paradigms, just as we spent decades before that productively fetishizing the realistic over the romantic. It’s time for new narratives– plural, not monolithic– to evolve that build upon the gains of postmodernism in a way that is relevant to the social crises and complications of a new century and a new generation. Participatory textuality will undoubtedly play a role in their construction, as it long has in many eras and cultures; so will explorations of evolving technology; but so, I think, will material existence and authorship. It will be for future generations to evaluate the merit of this combined and diverse inheritance.

Kirby claims that what he fears is the displacement of “hyper-consciousness” with “trance,” or what scholars derisive of popular literature have long termed “escapism.” But it seems that his real anxiety stems from upheaval and polyvocality themselves. He fears what he refuses to understand– a strikingly anti-postmodern characteristic– and his attachment is much less to the strengths of modernism and postmodernism than to the reactionary and myopic prejudice that informs his understanding of both.

Parenting: the endless social experiment

I was nineteen years old when my child entered my life (twenty when he reluctantly saw daylight), and without a moment’s notice I went from a messed up young lower-class drudge to something a whole lot scarier:  a Parent.  Suddenly, a new life depended wholly on my choices.  That’s the obvious part.  But as he’s grown, something infinitely more complicated and just as inescapable has crept in.  A parent is the only philosopher in the world in the eyes of her young child, and whether she chooses to accept it or not, that responsibility is just as heavy as survival.

It started with that word so many parents dread:  Why?  Suddenly, as a sharply perceptive and insatiably curious two year old, my son wanted to know why it was raining; why the dinosaurs died; why humans walk on two legs; why playdough gets hard when you leave it out; why mommy needs to be alone so much; why carrots are orange… the list is endless.  And the trite answers were never enough.  Without fail, he repeated the question until the succinct, formulaic responses were exhausted and we entered the territory of theory, critical thinking, scientific reasoning– in essence, of belief, doubt, knowledge and meaning.

I have so often indulged in this kind of questioning myself that it came as a shock to me how exhausting and disconcerting it was to be faced with such demands every day, by a person who still couldn’t put his own shirt on or count to twenty.  I realized that however often we might think philosophically, in order to function in the world we wear blinders the vast majority of the time, allowing ourselves the kneejerk assumptions that keep us from existential paralysis.  To face an eager mind that has not yet formulated these assumptions is a momentous and kind of terrifying task.

I didn’t have a tried and true method of handling this job, so with much trepidation I faced it as a sort of endless experiment.  We would have to teach each other.  I would have to listen to what he needed and what he could understand; there was no possibility of preparing some kind of lesson plan to guide him through life.

Furthermore, no matter how intellectually open I tried to be, it was literally not possible to give him answers that wouldn’t guide him in a certain way of thinking.  I could be idealistic about it and convince myself I was setting him on the right path; I could alert him to the subjectivity of my explanations; I could feel cynical and try to ignore my limitations, but none of that would change the basic fact that I was molding my child into the person he will become.  I am the game master.  His responses are his own, but the cues, the board he moves on, are mine.

It’s my job to not just say what comes into my head, but to calculate how it will affect him, and monitor his response.  Sometimes it’s distasteful.  Sometimes it feels artificial and arrogant.  But I remind myself that it’s better than the alternative, which is to attempt, self-deludedly, to deny my power and let my words push him where they will.  I hate to judge others, but I see parents attempt that “strange denial” every day, and I see the chaos and confusion in the faces of their children; the parroting back of heartbreaking, thoughtless attitudes.  I do not want this.

He walks sleepily out of his room in the morning and finds me plucking stray eyebrow hairs by the bathroom mirror.  He wraps his arms around my kneecaps and asks what I’m doing.  I tell him.  He asks the requisite question.  I hesitate.  A multitude of answers flood over me.  Because I feel obligated to by society’s norms of what and how a female-bodied person should be?  Because when I didn’t, I was teased and put down for “looking like a butch dyke”?  Because I spend time looking in the mirror disliking what I see and agonizing over the best ways to make myself palatable?  Because I can’t excise the part of my mind that insists shapely eyebrows are objectively beautiful?

All those statements are true; but are they the truths I want him to carry with him?  No.  They are not ideas with which he is equipped to deal.  They serve no purpose to him, not now.  They are not shameful, but they are sad truths, and ones he not only can’t yet understand, but should not have to.  I settle on another truth, one that is kind, to him and to myself in his eyes and mine.  “Because I choose to,” I tell him; “I like the way they look better this way, and it’s fine to look however you choose to.”

This registers in his gaze.  “I want my hair to look like Shaggy,” [i.e., from Scooby Doo] he says, grinning.

Another day, he asks me why I have so many piercings, and I give him roughly the same answer.  “Can I get rings in my ears too?” he asks.  “Not now,” I say.  “That’s not a decision kids can make.  But when you’re eighteen, you can get any piercings you want.”  He considers this.  “Well, I don’t want to,” he says decisively.  He never wanted to.  He just wanted to know where he stood.  Sometimes, he is experimenting with me, as well, and I’m okay with that.

We are in a diner, indulging my craving for hashbrown potatoes with jalapenos and onions, and he is coloring a picture of two children buckled into a car’s backseat.  One of them, who has Princess Leia buns, is becoming blue with orange hair; the other, with glasses and a baseball cap, is getting orange skin.

“Will his hat be blue?”  I ask.  He doesn’t stop to think about this, but I immediately do, and become self-critical.  Since when do I buy so wholly into gendered signifiers?  Why would I impress such an assumption on him?  I try to make a quick save of the situation.  “Or her hat,” I amend.  “We can’t really tell, can we?”

“This one is a boy,” he says definitively.

“How do you know?” I ask.

He shrugs.  “Because I decided he is.  They are brother and sister.”

The results of my inadvertent experiment are confirmed.  There must have been uncountable instances in which I unthinkingly perpetuated similar gendered, archetypal ideas; so have all the other adults in his life.  Regardless of how he chooses to frame it, he understands that cartoon people with glasses and baseball caps are male; those with buns are female; and male-female pairings are by default siblings.  I have taught him to assume ideas that sicken me.  And I have proven that I still hold those assumptions much more than I’d like to admit.

The good part is:  Lesson learned.  Now I am more cognizant.  When we encounter characters whose sex and gender are not specified, I either use gender-neutral pronouns, or I challenge common practice by defaulting to feminine ones.  I’d be the last to say that making any kind of gendered assumption, with or without signifiers, is an end goal, but in the interim it’s a way to compensate for the conditioning he’s clearly already internalized.

I watch him to gauge the results.  The change is slow, maybe imperceptible.  But I try, and I watch and listen.  That’s all I can do.  That’s all any of us can do.  The important thing is to know we’re doing it.