A very autistic Thanksgiving

I really love holidays.  Is that a surprising statement for an autistic?  Perhaps “the holidays” conjures up connotations of stressful shopping extravaganzas and noisy, harrowing dinners with grossly extended families and too much alcohol.  Neither of those has ever been a fixture of my celebrations.  What I’m a sucker for is routine and ritual, and I actively cultivate traditions that are meaningful to me, which usually involves embracing the natural season, the comforts of home, giving handcrafted gifts, and food.  So much food.

But usually, these intersect with spending the designated dates with family and close friends (translation: cooking for those people!)  And at some point I realized that events have conspired to have me partaking in Thanksgiving dinner completely on my own this year.  At first I was concerned.  I worried that I’d have trouble finding meaning in a solo celebration, would be tempted to just ignore the holiday altogether, and would end up feeling depressed and isolated.  But I soon started to reflect on how much I’ve come to appreciate time spent quiet and alone, and realized that all I needed to do was make the decision to have a simple, beautiful celebration on my own, and appreciate it for what it is.  After all, if we choose a day to contemplate gratitude, gratitude for time to oneself and one’s own good company seems like an appropriate place to start.

So I have a plan, and one I genuinely look forward to.  (Plans are my best friend.)  I will start with a clean home, and prepare a simple seasonal dinner to my own tastes: spiced apple cider, a fake turkey roast, maple baked acorn squash, cornbread dressing, garlic mashed potatoes, mushroom gravy, and roasted brussels sprouts.  (Positive aspect of cooking for one?  You don’t have to feel bad that nobody else loves brussels sprouts as much as you do.  And dessert?  Who needs dessert when you’ve got all the brussels sprouts to yourself?!)

I’ll avoid the trap of eating in front of the TV, because I do that pretty frequently and honestly, it doesn’t feel like a “real meal” to me when I’m distracted by watching something.  Instead, I’ll light my oil lamp and put on some music; Vivaldi will probably be involved.  After I clean up from the meal, I’ll take a stroll around the neighborhood and savor the autumn weather.  Then I’ll put on my fuzzy cat pajamas and curl up on the couch with my two fur babies to reread some favorite poems and short stories, and end the evening with a couple episodes of Mushishi that I’ve seen approximately 1.3 bazillion times and love all the more for the repetition.  Again, having no one else around means I don’t need to take different viewing pleasures into account.

Planning this alternative sort of celebration has helped to drive home for me how much intention and attention factor into creating comfort and meaning in my life, on “regular” as well as “special” days.  I hope that others, whether on the spectrum or not, are also finding ways to enjoy and make the most of where they are in life right now, rather than accepting the impossible burden of societal expectations about how, when, and with whom we should cultivate tradition and invoke a celebratory spirit.

How do you all deal with the holidays as they relate to disabilities and mental health?  I’d love to hear about it in the comments!


Hallowe’en post: The dance with death

My son has developed a recent and intense curiosity about mortality.  “I really want to know what it feels like to die,” he says to me often these days, frequently as I’m tucking him in.  “I wish I knew what happens to people.  But when you find out, then you can’t tell anyone.”  I think our roommate is a bit uncomfortable with what comes across as a morbid fixation, extending to matter-of-fact statements like “He [our cat, who purrs nearly non-stop] will stop purring when he dies,” and, of YouTubers with no recent uploads, “Maybe they killed themselves.” And yet, while clearly his conversational sensitivity is still undergoing fine-tuning, there is something extraordinarily pure in his musings.

They are not motivated by angst or disillusion.  He simply wants to know.  It seems the gist of his many queries is– Isn’t this strange?  Here is something momentous that happens to absolutely everyone, but no one can tell me what it’s like?  And what ideas can help to make sense of it?

“I believe it’s like having a long dreamless sleep and never waking up,” he says, yet unwilling to quite equate this with an idea that a person simply ends, becomes nothing.  He speaks of self, hovering between conceptions of body and brain as the whole, and of spirit that exists similarly between the dead and the unliving; the great-grandmother he barely remembers, and the stones he pulls from the dirt and arranges on shelves.  It is a dance that is remarkable to watch, and worth treading carefully around, this bit of growing up.

In my studies as a medievalist, I read and wrote about how changing religious traditions affected and reflected attitudes toward dying and the dead.  In this, as with so many issues to which it alludes, “Beowulf” presents a deceptively small window that opens onto a vast sea-change.  It begins, after all, with a lovingly wrought telling of an equally lovingly arranged ship burial, a tradition whose roots extend to prehistory– which was, we must remember, a recent state of affairs for the Germanic peoples of the first millennium– but whose demise can be concretely mapped to those peoples’ assimilation to and of Roman and Celtic Christianity.  And it ends with a corpse that is buried, but first burned.  Beowulf’s funeral is a portrait of a people grieving a dear loss and, what’s one and the same, fearing for the fate of a society whose heroes are gone.  A telling detail is that with him was sacrificed– destroyed, or at least, interred, and gone from circulation– a store of wealth: a practice the incoming Church energetically worked to replace with contributions on behalf of the departed’s immortal soul, with far-reaching economic implications.

Yes, “Beowulf” is a transitional (arguably self-consciously so) text steeped in turbulent social transitions regarding and mirroring the deeply personal transitions they contextualize.  Marriage, the forging and fragility of political alliance, the succession of leadership, the taming of wild things and places, and not least, death.  By comparison, an Anglo-Saxon living in the Global Northwest today would likely observe no collective turmoil at all with regards to end of life and burial practices.  I wonder sometimes whether, from a psychological perspective, this eases or complicates our private meanderings around mortality.  What questions would a child some 1300 years ago have asked, and what answers would she have found, and from where?

I have feared death; I have lain awake nights, stricken about it.  I have sought it out, and come closer than is comfortable to talk about.  More recently, I think about the concept of “second death”– the point, hidden somewhere in the future, when a dead person is mentioned for the last time.  The time after which one will never be remembered, personally, again.  It strikes me that the diversity in the time between one’s death and “second death” is overwhelmingly more vast than the diversity among lifespans.  I consider a future lacking not only me, but memory of me.  I scan myself for an emotional reaction, and there is less than I anticipated.  The unhurried curiosity I experience is, perhaps, recognizable beside my son’s.

At some point my own death and memory came to seem really not very important to me, in sight of the death I fear of so much that is larger and more lasting and more beautiful than I am.  There was a time when I wanted my death to mean something; there were times when I wished death could erase all traces of me like a “Replace All” function with nothing in the second field.  Now when I worry, I think about waste.  Product in, product out.  Seeders, leechers, and peers.  Things worth going back inside for when the house burns down, when the house will burn down.  Is this part of not being young anymore, but not yet being old?  Is it part of loving a child?

Popular tradition, riffing more on Celtic than Germanic folk themes, and drip-brewed through centuries of Christian influence, paints in impressionistic strokes a thinning of the walls between worlds, a blurring of lines, a resurfacing or drawing near of souls past or beings beyond, on this particular day each year.  A night of spooks; a day of saints.  The emotional reasoning appeals: at no time does transition write itself in nature more than in mid-autumn, when the length of daylight seems to dwindle all of a piece, and migration and foraging seem goaded on to a frenzy by the certain but unpredictable descent of winter and its stillness.

It is a fine time for wood fires.  The lengthier darkness and inviting chill mean more chances to sit and watch the dance of the flames against the shadow, together or alone.  There are hours yet before we need to sleep.

Getting to the good parts

People have a look that they get when you tell them you’re autistic.  I was baffled by what this meant until I started reading the literature surrounding neurotypical narratives of autism.  There’s a word that’s ever-present in those narratives, whether or not it’s actually written.  Tragedy.  A person locked in a life of isolation, rigidity, and underachievement, unable to enjoy the neurotypical norms of concepts like intimacy and intuition.  Reality not matching the ideal of a child conceived.  Many people are eager, perhaps without even realizing they are doing so, to recode the autistic as a “person with autism,”  to avow “awareness” in the name of “intervention,” to erase the identity entirely and call it inclusion.

Individuals can be forgiven for thinking this is the sensitive belief to hold.  Read about autistic experience and you learn of loneliness and frustration that is poignantly real.  But it is pervasive narratives of tragedy that foreground these truths at the expense of other equally real aspects of autism, of which the vast majority of those who know something about neurodiversity still seem largely unaware.

Being autistic can be fun.  Sure, there’s probably a whole range of pleasures neurotypicals experience that most of us struggle with or just don’t like.  But if so, the corollary is true as well.  Just like I will probably never understand what the huge god damn deal about The Godfather is, but many of the people who consider that blasphemy will never share my appreciation of Dover Beach.

1)  One more time with feeling

To start with, boredom is different for me.  I get bored really quickly with nothing to do, but I don’t get bored with things the way most people seem to.  If I like a song, I quite happily listen to it a limitless number of times.  There are books I’ve read dozens of times, and movies I’ve seen hundreds of times, to the point where I can (and do, if I’m alone) speak along with the dialogue.  These things simply don’t stop giving me pleasure.

And with each repetition, if a scene or wording or melody elicited an emotional reaction, I get the same reaction again.  Instead of getting used up, those reactions compound, and the meaning I receive is more pleasurable because it’s exactly what I expect and look forward to.  Often, the first time I experience something, I don’t know yet whether or not I like it.  Ask me after twenty replays and I’ll have a very strong opinion.

It might sound dorky, but in a very real sense, the media I add to my “playlist” are my friends.  I get excited to see them again.  I know I can count on them for good company, and they’re always there when I need them.  I’m at ease with them because I’ve arrived at a soothing degree of familiarity.  Why wouldn’t I like having friends like that?

2)  Say it again, Sam

I’m the same way with jokes, and with reactions in general.  The same joke in response to the same stimulus just gets funnier to me.  Or, more accurately, it “builds up” an additional type of pleasure that I have trouble describing, and I honestly have no way of understanding how many other people share it.  That becomes the thing to say (or hear) in response to that situation.  It fits.  It’s somewhat like those “most satisfying” videos that made the rounds a while back.

I won’t say it’s like scratching an itch, because it isn’t a compulsion.  The motivation isn’t that it would feel so wrong to not say it, it’s that it feels so gloriously right when I do.  More simply, I don’t need to, I want to.  Even if no one else is there, I’ll still make the joke.  (Or say the phrase, or do the sound effect.  Yep, the whole world has sound effects in my head.)

Just one example:  Whenever I’m getting ready for a trip, I say, “Preparations, Bursar, preparations!”  It makes no difference whether anyone’s around to hear, because that’s a throwaway line from Educating Rita that you’d be extremely unlikely to recall unless you’ve watched the movie as many times as I have, and trust me, you haven’t.  But it still makes me really happy.

I can’t say with certainty that it’s a facet of autism per se, but anecdotal evidence has told me that NTs by and large really, really don’t appreciate the amount of repetition that I do, in either reception or behavior, while many autistics know exactly what I’m talking about.

I find it hard to imagine experiencing the world without that particular form of enjoyment.  It seems like it would feel like losing one of my senses.  A little emptier and sadder.

3)  Talk nerdy to me

One of the best things about autism is special interests.  On the whole, autistics are nearly always nerds.  We tend to have very particular topics about which we are intensely curious and passionate.  Sometimes only one, sometimes a few, and sometimes they change over time.  For me, astronomy and language are my long-standing fascinations.

We want to know absolutely everything about those topics.  There doesn’t seem to be a point at which we say, “okay, enough of that.”  Often, we can mentally retain huge troves of information about them to recite at will.  I have trouble remembering the simplest things about my day to day life, but ask me who’s currently on board the International Space Station or for a comprehensive history of Mars orbiters and landers and I’ll happily tell you all about it.

What’s at least as enjoyable as devouring these endless amounts of information is sharing them.  We love to talk about our special interests.  Typical “small talk” is exhausting, but explaining far-infrared interferometry is a truly joyous occasion– perhaps even energizing.  I don’t think the reason for this is that we want to show off and appear smart.  We’re just so enthused about our learning and thinking that we love when we can let it bubble over freely, kind of like when someone falls in love with another person and can’t wait to tell the world.

Unfortunately, as with repetition, many NTs don’t appreciate hearing nearly as much about our special interests as we would like to share.  We tend not to excel at give-and-take conversation without conscious effort, and often find it difficult to know where to draw the line.  Unless we’re careful to curtail our enthusiasm, I think we can come across as tedious and rude.  This has often led me to feel embarrassed about acknowledging the peculiar extent of my passions, but in turn, that makes it all the more exciting when someone shows a genuine interest and really listens.

4) Ten thousand hours

Akin to my love of repetitious media is my willingness to doggedly repeat trivial tasks until they become second nature.  When I was ten, I played Minesweeper for so much time every day that I was able to beat the most difficult level every single time in under 20 seconds.  These days, it’s sudoku instead.  In the past half decade or so I have done thousands of sudoku puzzles.  I’ve gotten to a point where, often, I don’t have to engage in the linear logic process to deduce what digits to fill in– I can just glance at the puzzle and know what to put where.

I’ve heard a theory expounded that becoming an expert at something requires putting in ten thousand hours of practice.  I haven’t the slightest idea whether that’s true in general, but I am certain that I have put more than ten thousand hours into multiple activities in my life.  Perhaps it’s unfortunate that they tend to be skills that have no serious value in the world.

But the process of a task progressing from baffling to intensely intuitive is deliciously satisfying.  It’s like cracking a code, or maybe a safe.  Maybe part of the pleasure stems from the fact that these skills are inherently crack-able, like learning the algorithm to solve a Rubik’s cube.  They are orderly, static, and ultimately entirely comprehensible regardless of the level of complexity.  That’s a comfortable space to occupy, in stark contrast to the chaotic and overwhelming processes at work in the world at large, especially those that factor in other humans.

Cue The Thing To Say:  “I’m not sure I’d know how to ‘dabble,’” quoth John Thornton in North and South.  (Goofy grin ensues.)

5)  Pain don’t hurt

In truth, I’m a total wimp when it comes to even moderate discomforts that go on for long, such as room temperature variations.  But in the case of acute pains, even pretty intense ones, they just don’t register as a big deal to me.  My understanding is that this is two-part:  On the one hand, the pain I experience just isn’t as severe as it is for most people.  And on the other, what acute pain I do feel just doesn’t faze me that much, for whatever reason.

I suppose I can’t really say that this characteristic is “fun,” since I’m not a masochist.  But what it is, is extremely useful, particularly insomuch as it grants me a degree of self-sufficiency that I cherish.  I’ve had no qualms about performing my own ear and facial piercings, which, just like cutting my own hair, has saved me hundreds of dollars versus having a professional do them, and also prevents me from having a stranger’s hands on my head, which I loathe.  (Warning: grisly examples follow; highlight at your own risk!)  [I’ve stitched up my own wounds, lanced my own abscesses, and excised a small amount of necrotic tissue that I stupidly caused by suffocating a fresh piercing.]  Again, this means I’ve avoided several trips to the doctor that would have been distressing for me for sensory reasons.

By no means all autistics have low pain sensitivity/high pain tolerance, and it exists in NTs as well.  But it seems to be more common with us, which I venture to suppose makes sense given that it’s probably a neurological difference.  It feels to me like a pretty good consolation prize for crying if my feet are cold and feeling like I need a shower if a stranger gets touchy-feely.

 Conclusion:  I am not your crusade

Nothing I’ve written here is remotely intended as bragging.  Sometimes I joke that “autism is my superpower,” but being earnest, I don’t value these aspects of myself because I think they’re special or impressive.  I value them because they bring me joy.  I’m not a savant, a “supergenius,” or a person who’s ever likely to make a deep impact on the world.  Honestly, I was pretty far along in life before I understood that not everyone has these same characteristics and preferences.

I’m extremely committed and proficient with regards to some things, and painfully incompetent at others.  Chances are, so are you, whether you’re autistic or not.  If we’re to talk about impressiveness, I’m far more in awe of people who can maintain full-time employment and have a number of close friendships.  It’s just that the world as it stands tends to applaud and compensate achievements characteristic of neurotypicality far more than autistic-associated ones.  That’s a dynamic of minority and privilege, not pathology, and it’s why our self-advocacy is pivotal.

My point, rather, is that being autistic can be a frustrating experience, but it doesn’t have to be a miserable one.  We aren’t suffering from an illness, and you shouldn’t assume you need to pity us because some of what you hold dear is beyond our grasp.  I’ve felt a lot of hurt at the things I can’t do– largely because I’m aware that they are what’s expected, and that damages my self-esteem.  But if you gave me a big red button that said BECOME NEUROTYPICAL, I wouldn’t consider pressing it for even a fraction of a second.  You couldn’t pay me enough to touch it.  This is who and what I am.  I don’t want to be someone else.  I just want the same as anyone, to be valued, loved and supported, not in spite of my eccentricities and foibles, but because of them.

We are not broken versions of neurotypicals.  Perhaps you look at us and believe we are puzzles with missing pieces, like a house without any doors.  That’s not the case.  It’s just that while you were fussing about the door, we were busy looking out the window.


Postmodernism’s dead, but elitism isn’t. Luckily, neither is culture.

I’m going to take a detour from the typical content of this blog, because today I found myself reading an article from Philosophy Now that so profoundly irritates me that I can’t resist detailing the reasons: “The Death of Postmodernism and Beyond,” by Exeter-educated Oxford dweller Dr. Alan Kirby. Kirby holds a PhD in English Literature, my erstwhile field of study, so I feel both entitled and compelled to disagree with him in a deeply catty way appropriate to the long history of cattiness and disagreement in our discipline.  (You should know, before starting out, that the correct way to read any quote from Dr. Kirby is to follow it with an indignant “harumph!”)

After concluding that postmodernism and its “fetishism of the author” are, indeed, dead (or at least, so senescent as to be reduced to “a source of marginal gags in pop culture”), Kirby opens a section titled “What’s Post Postmodernism?” only to, without justification, discard that commonly-used terminology and state that he will refer to contemporary culture as “pseudo-modernism.” He gives no reason why he chooses this epithet, so I can only conclude that its primary purpose is derision– “pseudo” anything rarely sounds promising. He defines “pseudo-modern cultural products” as those that “cannot and do not exist unless the individual intervenes physically in them.”

It’s true that such “products” exist, in many forms, whatever we choose to label them. I’ll reject his mocking moniker and choose to coin the more descriptive “participatory textuality.” Mobile devices and broadband internet have, without question, opened up many new ways for culture to move from static text to exchange. However, Kirby errs hideously in going on to write, first, as though participatory textuality is the dominant, or even exclusive, mode of twenty-first century culture, and second, as though this mode is entirely novel.

Kirby reveals a strange preoccupation with, of all things, the “reality” TV show Big Brother, seeming to see it as some sort of apotheosis of contemporary media. I have never met even one avid consumer of popular media who would acknowledge it as such, and it is in fact not at all representative, not only of millennial textuality as a whole, but even of the “reality” genre. Therefore, I assume that Kirby’s insistence on its importance is due either to grasping at straws to support his curmudgeonly, stubbornly out-of-touch assessment, or to having been exposed only to bits and pieces of any twenty-first century “cultural products,” and consequently building a precarious argument on a few small points of reference.

Regardless, he makes much of the fact that Big Brother, unlike, say, Great Expectations (note that he chooses, here, a “classic” text, missing the opportunity to observe that many works of considerably less staying power also have an independent material existence; The Da Vinci Code, for example), “would not exist materially if nobody phoned up to vote its contestants off.” This is simply silly. Should the telephones of the world fail and Big Brother therefore fall from glory, we would still be left with Cupcake Wars, Real Housewives, The Bachelor and all the other “reality” shows that are much more representative and relevant. One might as well point out that newspaper advice columns would not exist if readers didn’t write in with their questions. While true, this would be a banal point that would say nothing of the nature of newspapers, publishing, or the twentieth century on the whole. Does such a narrow corner of a generation’s cultural consumption deserve to be criticized as the successor to as broad a movement as postmodernism? I don’t think so.

Connected to the format of Big Brother, Kirby asserts, is the entirety of the internet, whose every aspect, he claims, is “pseudo-modern.” He argues that “clicking on [one’s] mouse” (a peculiarly particular way of describing internet navigation) creates a “pathway through cultural products” that “cannot be duplicated,” implying that this is, in itself, a form of participatory textuality, in contrast to, say, the reading of a novel, which is meant to be done just one way. But the novel or monograph was hardly the exclusive textual form in previous decades. Newspapers, magazines, and collections of jokes, short stories, essays, and poems have all long been standard fare, and it’s difficult to believe that the typical way of consuming these texts has ever been by reading straight through from beginning to end, making the experience identical for each reader. Does it reduce the uniqueness or value of content for a reader to navigate it in a way that makes sense to them? I don’t think a feminist critique or a science journal article is diminished by its proximity to a Buzzfeed quiz, any more than a printed text is diminished because the reader gets up to do the dishes mid-chapter.  Realistically, this is how content is incorporated and embraced in our prosaic and individual lives, rather than remaining the domain of those for whom it is a profession.

Kirby also invokes the downloading and “mixing-and-matching” of individual songs as a distinct mode of consumption from “the idea of the album as a coherent work of art, a body of integrated meaning.” The buyers of “single” releases and of compilations like the “Now That’s What I Call Music” franchise, and the listeners who, when I was growing up twenty years ago, skipped through CDs to listen to their favorite tracks, would no doubt be disconsolate to hear that they were getting it all wrong by insufficiently appreciating the considered monographic nature of an album, and thereby nosing in on its very authorship. The demand that an album be considered only as a whole sounds like a complaint from your emo ex-boyfriend with regards to the 3-hour “rock opera” he recorded in his basement; when professional musicians contest their authorship vis-a-vis intellectual property, it’s usually to demand compensation, not that you stop listening to their songs in the wrong order. They, quite rightly, don’t see this as much of a threat to their authorial claim. The demand for creative integrity could easily be taken to extremes, and seem not much more absurd than it does here: Imagine if you were told you could only rightly read Dubliners if you proceeded posthaste toward Finnegans Wake, without “mixing-and-matching” the intervening texts with, say, essays by Alan Kirby; otherwise, your boorish whims would be infringing on Joyce’s authorship.

The consumer’s tendency to “mix and match” online or downloaded content– not just music, but any media– is a far cry from negating its authorship and material existence. The vast majority of online content still takes the form of static pieces written by specific people. The author may not always be “fetishized” as anyone from Dickens to Derrida was, but then, neither, on the whole, are the authors of print newspaper and magazine pieces. Many content creators either online or in print– and often both– are still well-known and avidly followed by name, from Joel Osteen to Richard Dawkins. Kirby seems to think that content on the internet is in a wild and uncontrolled state of flux, in which any given URL provides individuals different content from click to click, second to second. While it’s true that pages can be and are edited or removed, it’s equally true that the internet publicly memorializes content, in many ways, to a much greater extent than print media. Just ask anyone who has tried in vain to get compromising content removed. Content that, if printed, consumers would have to search long and hard for in the bowels of library archives is instead available years later at the click of a “search” button, and even if it’s been removed, its material existence very often continues in the form of quotes, screenshots, and cached versions.

Nor does the ubiquitous existence of comment threads do anything, as Kirby insinuates, to infringe on material existence or authorship. People comment upon a published piece; that doesn’t mean their comments are the piece. Discussion of a text as part of its enjoyment is hardly new. Comment threads are little different from discussions among friends, book clubs, or reviews, save that they enable individuals from diverse locales and backgrounds to share and contest their reactions– is that a change for the worse? Sure, some outlets publish content created out of user input– and how exactly does this differ from the inclusion of a “letters to the editor” section, or the solicitation of independent op-ed pieces? For that matter, what is a literature class but an opportunity for “recipients” of culture to become participants as they analyze, compare, write about, and argue over extant texts? Is Dickens any less the author of Great Expectations because generations of students have tirelessly done the academic equivalent of hitting “enter” on the comments section– often with a similar degree of insight and style?

Kirby further overreaches in his attempts to illustrate the “pseudo-modernity” of the internet. He mentions, in short order, mapping and navigation applications; Wikipedia; and blogging as evidence of the authorless evanescence of the medium. This is no different from trying to define twentieth century media in terms of family Christmas newsletters, the Encyclopedia Britannica, and the Rand McNally Road Atlas– an overview that would probably yield a similar assessment.

Satellite maps materially exist, of course– though in binary, not analog, form– just as much as the pages of the Road Atlas do; their use simply varies depending on the needs of the consumer, who uses text boxes and zoom tools the way someone thirty years ago might have used a highlighter and a magnifying glass. The only difference is that Mapquest does what Rand McNally tried to do– but better. It was always the objective of maps to provide a dynamic way for readers to navigate on their individual journeys. Now, we simply do so more efficiently, accurately, and practically. And for those who consider satellite navigation too “ephemeral,” never fear; Rand McNally is still in business.

Likewise, an encyclopedia was never intended to be a static treatise. It was always contributed to by a multitude of “experts” of greater and lesser degrees, mostly unacknowledged (or certainly, few users noticed or cared who they were.) New versions, appendices, and errata were continuously released because information, unlike printed texts, does not sit still. Again, all Wikipedia does is use technology to better fulfill the function encyclopedists always intended to serve. Edits are not arbitrary, continual, anonymous, unregulated, or impermanent. Rather, a team of dedicated editors vet them, lock down certain pages, reverse spurious changes, and flag areas that need improvement. The average Wikipedia user comes to the site not to arbitrarily insert their input and leave the entire encyclopedia in aimless flux, but to peruse a particular entry in search of information– just like they would with a print encyclopedia– except that now the information is up to date, more extensive, easier to find, and without the possibility of paper cuts. Of course, not everyone is in agreement about the value of Wikipedia as a factual resource, but the debate is one of reliability, not ephemerality or attribution.

As for blogs– Kirby calls the act of “making up pages” (emphasis his) “intrinsic to the internet” (while in the same sentence asserting that “pages are not ‘authored’”)– they do indeed offer a newly effective, cheap, fast, and potentially anonymous way to personally disseminate information. They are, in that sense, more democratic than print media. But this is a quantitative, not a qualitative, difference. Xerox machines and self-publishing presses are old news. Before them, there were mimeographs and carbon copy typing. Blake’s poems and accompanying woodcuts were self-published. And a thousand years ago, scribes were laboriously copying popular texts with little to no concern for accuracy or attribution, shamelessly amending, abbreviating, and anonymizing, while packaging their latest risque riddles along with their versions of long-standing oral tradition. Unmanaged, more or less democratic popular contribution to the cultural narrative is not novel or worrisome. It is and always has been true that anyone can write, type, or record and share anything they like, and either put their name on it or not. There is now, no less than then, and for better or worse, a recognizable hierarchy of amateur and “respectable” sources, both on and offline.

Kirby also seems to be unaware that forms of entertainment that “foreground the activity of their ‘reception’” are far from a modern (or “pseudo-modern”) phenomenon. Dance music today is, yes, meant to be danced to– of course. So are the traditional dance tunes of Gaelic and other cultures, and so were Baroque rondos and minuets. As for his assessment of “industrial pornography,” if he thinks that older publications of erotic texts and images were meant to be “read or watched” but not “used,” he is not only ignorant but delusionally naive.

In fact, the pre-”pseudo-modern” culture that he seems to hark back to is not a long-standing reality, but a lingering conservative fantasy of the Western European upper classes that erases the importance of cultural expressions that preceded, disregarded, or were outside the scope of the Enlightenment, or which its enshrined thinkers deemed uncouth. This bias is neither modern nor postmodern, but simply elitist. It looks with disdain at the organically evolving, actively participatory, and often deeply subversive media that lie outside his chosen pantheon. It’s tempting, in fact, to label such elitist overtones “pseudo-postmodernism.”

Kirby gives us a hamfisted clue to his true theoretical underpinnings when he derides the “puerile primitivism” of focus on “the acts which beget and which end life”– a painfully refined way of referring to sex and violence. Neither of these is a subject whose central importance is new to post-postmodernism. They are the blood and bones of nearly every cultural tradition, from children’s stories to aristocratic literature, with the uncommon exception of Victorian prudishness– and they are such because they are profound, intellectually and emotionally disruptive experiences that nearly all of us hold in common. It is absurd to imply that modernism and postmodernism, which trumpet such works as Ulysses (which also contains farts!) and Lolita and American Psycho, as well as a continuing fixation on Freudian sexuality, are somehow above such “puerile” concerns. Would he prefer that subject matter were confined to sexless, epistemologically troubled middle-aged white men engaging in literary criticism? Or perhaps storks delivering babies to erudite, hyper-conscious Oxford couples, who later die ironically but non-violently in their beds? It’s both amusing and troubling that, while maintaining such a stance, he still lays claim to feminism and postcolonialism as artifacts of his tradition, when neither could exist without centralizing frank discussions of the uncouth aspects of oppression. Angst may be polite, but suffering is not.

In short, none of the characteristics of the digital age have gone nearly so far as Kirby would like us to believe toward making authorship “irrelevant, unknown, sidelined” or content “characterised both by its hyper-ephemerality and by its instability.” The internet has, undoubtedly, increased the amount of content that he would probably call “vacuous” and “banal,” but it has also increased the equitable dissemination of all forms of media– including, ironically, his own essay, which I assume he would deem substantive. It’s of little importance that Big Brother doesn’t hold up well as a rerun, because millennial culture, and even participatory textuality itself, is not defined by call-in “reality” TV shows or transitory memes. Authorship and stewardship are in no danger, when the vast majority of content is still scripted, attributed, and recorded. It’s also interesting that someone staunchly upholding the intellectual value of the postmodern should prove so technophobic, since early postmodernism avidly embraced the advent of media that were new, odd, and experimental, such as video and screen-printing.

Even should transience take hold to the extent Kirby fears it has, the situation will be far from as dire as he lets on. He opines, “A culture based on these things can have no memory – certainly not the burdensome sense of a preceding cultural inheritance which informed modernism and postmodernism.” This is, again, pure elitism; romanticizing of his immediate intellectual tradition; assuming that only the type of “cultural inheritance” that informed his studies, his experience and interpretation of the modern and postmodern, is of any substance. Every culture has a memory, but that memory need not be as rigidly defined as Kirby chooses in terms of monographs lovingly dusted and episodes of Fawlty Towers faithfully recorded and rewatched. Looking back on the history he chooses to forget, I challenge him to study Beowulf, a literary text that is in many ways a relic of an age of “evanescent” and “non-reproducible” oral storytelling, and conclude that there is no weighty (whether it is also “burdensome” is hard to say) sense of inheritance in its style, tone, subject and allusions. Or to claim that an unfilmed production of a play by either Shakespeare or Tom Stoppard is “ephemeral” and therefore valueless. “Non-reproducible” does not by any means connote “without import or effect.” Participant-recipients, not material texts, are the measure of effective communication.

Kirby concludes that, at its core, “pseudo-modernism” differs from (and pales before) postmodernism because participatory textuality “defines the real implicitly as myself” and is “of course [emphasis mine this time] consumerist and conformist,” devoid of postmodernism’s pervasive irony and questioning. I believe his mistake lies largely in attempting to compare an oddly-gathered smattering of the most commercial popular culture of today with a selection of the most intellectually influential academic and literary works of the previous century. Postmodernism did not much inform I Dream of Jeannie or Cheers, and so theoretical post-postmodernism, and with it the exciting creative and intellectual possibilities of democratic participatory textuality, are unlikely to be perceived in an analysis based on Big Brother.

The tyranny of vying religious fanaticisms and protracted warfare has, as Kirby suggests, helped to define postmodernism’s successors, as have oppression, technological advance, and global capitalism, but they have done so in no more a direct, top-down, thought-blunting fashion than imperialism, fascism, and nuclear armament informed twentieth century thought. Subversion cannot exist without something to subvert, and if something can be subverted, it will be. There is no paucity of irony, anxiety, and rebellious self-examination in millennial culture, but you will have to do better than a cursory, and arbitrary, glance to take it in. Post-postmodernism tends not toward abandonment of its predecessors, per se, but toward a reconciliation of modernist disillusionment and postmodern radical self-consciousness with the notion that sincerity, aesthetics, and personality can, applied judiciously, transcend both.

It’s not that all contemporary media take their own reality for granted; it’s that they posit a reality in the first place. Hipsters have made the postmodern a hokey fashion statement, but the backlash against hipsters is not simply philistine or reactionary; it’s a recognition that after a century of tearing down traditional lexicons of context and meaning, perhaps it’s time to develop new ones, so that the world becomes something that we can enjoy and actively shape, rather than merely play with and make virtual. I’ve often joked that class discussions on postmodern literature amount to this exchange: “I don’t like this book! It’s dull and pretentious and badly written!” “You’re not supposed to like it. It’s not supposed to be good. That’s the whole point!” At this point, such texts and attitudes are not surprising or inventive; they have become as trite and banal as Kirby accuses millennial texts of being. Pervasive irony is no longer ironic. Ubiquitous hyper-consciousness is no longer radical.

Is it so reprehensible to grow weary of textuality that places self-questioning above storytelling, and irony above empathy? Does desiring to feel immersed and emotionally connected with a text– or at least, in a way that involves some emotion besides existential angst– truly reduce to nothing but a “trance”? Does acknowledging irony and answering it with sincerity really signify being too empty-headed to “get it”? Perhaps what new creators seek is not to reflexively define their works as genuine, but to consciously make the world, and themselves, real; to not just comment on but counteract absurdity, crassness, and homogeneity. One does not have to return to the Dark Ages or Elizabethan England to assert that the resonance of heroism or tragedy has non-ironic value. One does not have to smash the printing press to see the possibilities in participatory storytelling and oral tradition.

I do agree with Kirby on one point: Postmodernism is dead. That’s neither to be cheered nor lamented, and certainly not debated. We spent decades productively coming to terms with the fallibility of hegemonic narratives and authorial paradigms, just as we spent decades before that productively fetishizing the realistic over the romantic. It’s time for new narratives– plural, not monolithic– to evolve that build upon the gains of postmodernism in a way that is relevant to the social crises and complications of a new century and a new generation. Participatory textuality will undoubtedly play a role in their construction, as it long has in many eras and cultures; so will explorations of evolving technology; but so, I think, will material existence and authorship. It will be for future generations to evaluate the merit of this combined and diverse inheritance.

Kirby claims that what he fears is the displacement of “hyper-consciousness” with “trance,” or what scholars derisive of popular literature have long termed “escapism.” But it seems that his real anxiety stems from upheaval and polyvocality themselves. He fears what he refuses to understand– a strikingly anti-postmodern characteristic– and his attachment is much less to the strengths of modernism and postmodernism than to the reactionary and myopic prejudice that informs his understanding of both.

Six subtle forms of abuse and what to do about them

The first step toward escaping an abusive relationship is to recognize that you’re in one.  The second is to realize that you deserve and can have better.  To that end, here are some characteristics, drawing on my own experience, that should help alert you to the possibility that you are being emotionally and verbally abused.

1) “Negging”

“Negging” has emerged as one among many despicable PUA techniques.  The aggressor, generally male, gives a left-handed compliment such as “You look pretty good for your size,” or “You’d be cute if you wore more makeup.”  But this technique isn’t limited to PUAs.  Abusive partners often continue to “neg” their victims throughout the relationship, as a means of keeping the victim’s confidence and self-esteem too low for them to believe they deserve a better relationship.  An example is when my ex-husband told me, “Admit it, you’re not that great.  You know people don’t find you attractive, but you never do anything to make yourself attractive.  Like those beige granny bras you wear.”  Nor does this necessarily have to apply to physical qualities.  It could also take the form of “That’s an okay degree, but it’s not from as good a school as mine” or “You’d be more attractive if you weren’t such a nerd.”  As with many things abusers do, these comments are designed to 1) increase their control over every aspect of your life, including how you look; and 2) make you feel so self-doubting and unworthy that you will meekly accept this control.  A supportive partner encourages healthy body image and high self-esteem.  An abusive one consistently tears you down.

2) Guilt-tripping you when you assert yourself

When you calmly and reasonably ask for something you need or criticize your partner’s behavior, do they listen and take to heart what you say, even if they disagree?  Or do they immediately become angry and accuse you of being “controlling,” “manipulative” or “selfish?”  Or perhaps they react with instant self-pity– “You’re so unfair!  I can never do anything right!  I don’t know why you even stay with me!”– implicitly demanding that you switch from self-advocacy to playing mother hen and soothing their fragile ego.  Often, you feel like you have to apologize for being unhappy.  Now, there have been times when I’ve felt ashamed and even cried about problems my partner had with me, but I have always tried nonetheless to take responsibility for my actions, apologize, and express an intent to do better.  It’s reasonable to feel sad when you’ve inadvertently made someone you love unhappy.  It’s NOT reasonable to connivingly turn the tables so that all the focus is on what you feel and not at all on your partner’s concerns.

3) Establishing a complex set of rules that you can never quite live up to

“Don’t slurp your tea like that.”  “Don’t use so much toothpaste.”  “Don’t talk to that friend of yours.” “Call me at this time every day, no matter what.” “Don’t say ‘needs washed,’ you sound stupid.”  “Don’t drink Pepsi with your cheese and crackers, experts say that’s disgusting.”  And on and on.  An abuser wants an infinite amount of control over your life, so for every hoop you jump through trying to make them happy, ten more will instantly appear, and your performance will still, always, be considered insufficient.  There will be rules you are supposed to know about without them ever being spoken.  There will be rules about things that are no one’s business but your own.  So many rules that your existence will feel like a pit of quicksand in which the more you struggle to stay afloat, the stronger will be the force pulling you down and crushing you.  Your abuser may offer elaborate explanations for these rules, excusing their ridiculous nature with tales of trauma from childhood, bad memories of other relationships, and hypersensitivity.  There is nothing wrong with speaking up about something your partner does that seriously bugs you and asking them to change it, but a partner needs to recognize boundaries of personal freedom, not set out hurdles according to their every whim just to trip up their victim and keep them in line.

4) Withdrawing affection and “privileges” to control you

I was once locked out of my own house for having a Pap smear done without my husband in attendance.  Another time, he wouldn’t visit me until I fell in line with his political philosophy.  Another partner refused to kiss me or hold my hand unless I cried and begged for forgiveness for all the mistakes I had, of course, made, in his estimation– often having to guess at what those mistakes were, because “If you have to ask, then I don’t want to tell you.”  These behaviors are simply unacceptable.  They are manipulative and cruel.  They are, again, designed to provide leverage for the abuser to control every aspect of your life.  If they can’t get what they want by demanding and guilt-tripping, they’ll take it by force.

5) Spreading horror stories about you to other people

This is bad enough when your partner decides to cuss you out to their own friends and family, telling only their side of the story, perhaps embellished with out-and-out lies, in order to shame you with public scorn.  It’s worse when they start doing the same with your family and friends, talking to them behind your back, sharing your confidential information and making you sound like the worst person in the world.  The purpose of this behavior is to isolate you, and to further lower your self-esteem by making it seem as though the whole world shares your abuser’s low opinion of you, so that you will believe you have no one to turn to and should be thankful your abuser stays with you at all.  Isolation and low self-esteem are meant to make you desperate enough to cling to your abuser through thick and thin– mostly thin and thinner.

6) Getting out-of-control angry during disagreements

A discussion can get pretty heated and emotional without turning abusive.  But when swearing and name-calling start (“You bitch!”  “You stupid cunt!”  “Fuck you!”), a line has been crossed.  So, too, if your partner displays ANY signs or threats of physical violence, whether it’s toward you, another person, an animal, themselves, or even an inanimate object.  It is NOT YOUR FAULT that an abuser gets enraged.  It is due to their own fucked up psychology, which prevents them from being rational and empathetic.  Even if you really were the lousy partner they make you out to be, there is simply no excuse for verbal or physical violence.  If they are that desperately unhappy, the thing for them to do is not to hurt you, but to simply walk out the door and maybe never come back.  Which is exactly what you should do if anyone ever treats you this way.

If one or more of these red flags applies to your relationship, here is what you should do:

1) Run.

2) Run fast.

3) Run far.

4) Don’t look back.

5) Lock the door behind you.

That’s really all there is to it.  I’ve never seen an abuser change their stripes and make good on their apologies and promises.  You don’t need to take a chance on whether they will hurt you the same way again, or even worse.  If you still care about them, then wish them the best as you run.  The hell.  Away.

Fiction is Real

I’ve written at length about how autism inhibits my ability to connect with and keep friends and lovers.  It is not an exaggeration to say that, for me, humans are another species.  I find it impossible to engage in their rituals and rites without feeling more as though I am conducting some kind of experiment than actually fitting in.  I can count on one hand the humans I know or have ever known who are acquainted with what I feel to be the “real” me.  It’s not that I don’t try or that I don’t want to connect.  I just really don’t know how.  Never were the famous words “all the world’s a stage” more true than for an autist in a neurotypical world.

When other people’s reality feels like a badly written script whose lines you never learned– and often, at that, a farce– the worlds we weave with words become ever more vital and more real.  Many times they make more sense, being the conscious effort of a single creative mind rather than the cumulative confused consensus of thousands of years of superstition, cruelty, ignorance, and corrupted power.

In particular, fictional characters have a leg up on their human counterparts.  As someone who has more than dabbled in writing fiction, I can attest to the concerted effort required to bring them to life.  To be believable, a character must be both complex and consistent; both flawed– even fatally– and lovable.  With the exception of “flawed”, none of these qualities are requisite for a human being.  When we read about characters, we don’t just see how they choose to act around us.  We are flies on the inner walls of their skulls.  We know– when they are written well– why they act as they do.  When they appear ignoble, they may be redeemed.  When they seem strong and stoic, they may show their vulnerability.  When they don’t say what they mean, we know it, and we know why.  They have no choice but to share with us their true selves, because by definition whatever identity we see in them is true.  In a very meaningful sense, the relationships we forge with them are more intimate than our bonds with one another.  We come to know them as we know, otherwise, only ourselves.

For all of these reasons, most of my best and oldest friends, and, embarrassing though it may be to admit, many of my early romantic interests, have been found in fictional worlds.  For any given turning point or troubled time in my life, I can point easily to the characters who saw me through it, when human acquaintances and lovers confused me, hurt me, and abandoned me.  It’s true that I often feel a deep sense of loss when I first come to the end of a given story– The Lord of the Rings and Star Trek: Deep Space Nine spring to mind– because it is an ending of sorts, but it has never meant that the characters left me for good.  They were always there for me to return to when I needed them the most.  The need to rewatch or re-read was only an opportunity to become reacquainted and to deepen my understanding.

To me, my deep connection to fictional places, events and characters underscores my conviction that autists are not inherently devoid of empathy, compassion and social meaning.  We simply relate in a very different way from neurotypicals.  We thirst for information and clarity where, among humans, there is rarely any to be found.  We go through the world feeling as though someone really needs to explain it all better, more thoroughly, more precisely, and that is what a good author does.

And perhaps I am not just an autist, but a romantic and a bit of an idealist.  Perhaps my lifelong fixation on fiction is a result of this, or perhaps it has conditioned me to seek a type of connection that simply is not possible between humans.  For one thing, I thought for many years that there was someone right for me out there, who would love me through thick and thin, until I was old and grey and saggy, and who would still find me beautiful, even when I did not feel beautiful and did not love myself.  Someone for whom giving up their other options wouldn’t feel like a sacrifice.  I no longer believe in that kind of love.  I no longer expect or try to find it.

And I no longer believe that if I just try hard enough to play the game and to be kind and to take risks, I will find many of the kind of friends who care enough to know me and don’t just come and go like snowflakes on hot asphalt.  I can’t speak to other people’s realities, but I also can’t keep hurling myself at the plate glass windows of other people’s homes hoping to break through.  And each time I have crept back, bruised and battered, to my den, I have turned the page or the channel and found that my fictional friends, who wait for me, have dusted me off, set me on my feet, and reminded me of what matters.  They have taught me not to pity myself.  They have given me the solace and strength to carry on.

So to me, by every meaningful definition, works of fiction, and fictional characters, are completely and intensely real.  They exist.  They may not have bodies, and I do not believe in souls, but they have throughout my often melancholy life leapt from pages and screens to take me by the hand and guide me through dark places.  They ensure that, to paraphrase Gandalf, while I must walk in sorrow, I need not walk in despair.  And I need never walk alone.

Fear of flying

Over time, I’ve learned some of what triggers my hypomanic (and, more recently, full-blown manic) episodes.  I always seem to swing upward in the fall when the daylight starts to change, which is not something I can control, though I do try to modulate it with enforced darkness at night and a sunlamp the rest of the time.  There are other factors that I can somewhat control:  getting enough sleep and food is important, and being active, but not too active.

The problem is that managing my mood swings starts to feel like a full-time job and leaves me with little time or energy for anything else.  First of all, what does “too active” really mean?  Last night I went to a political meetup for the presidential candidate I favor, and afterward I went to play Magic: The Gathering at someone else’s house.  I really enjoyed both, and when I got home, I felt buoyant energy coursing through me, so that I found it impossible to sleep until about 1 AM, despite taking my sleeping meds several hours earlier.  Then I had to get up early to go to a psychiatry appointment, so I ended up getting 5 or 6 hours of sleep.

Doesn’t sound like that big a deal, does it?  A couple of positive social situations and a couple of missed hours of sleep?  But it is a big deal, because all day, even though I’ve been physically exhausted, I’ve felt more and more manic.  I can’t fully express how frustrating it is that the simple act of enjoying myself and interacting with others for a few hours, or staying up late one night, can cause me to ascend into a manic state.  This state may only turn out to be a couple of weeks of giddiness and productivity, but then again, it might elevate into psychosis and put me in very real danger, as well as tax the patience of the people who have to deal with me daily.

Now, thanks to my mood, I’m in a strange state of waking without being fully present, and even though it’s past my bedtime, I’m incapable of rest.  When I try to lie down and sleep, thoughts swarm so thickly through my mind, like a plague of locusts, that I cannot stand it and must get up to distract myself– by writing this post, as it turns out.  I feel simultaneously irritable and expansive.  I want to see and feel and do everything at once, and yet I loathe everything.  Even though it’s an hour past my bedtime, I’ve opted to drink some coffee and stay awake, because the jittery energy with which caffeine endows me is preferable to being so exhausted yet agitated at the same time.

I deeply resent the fact that there are such potentially serious repercussions to this decision.  I feel like getting out of the apartment to participate in things I seriously care about and enjoy is beyond my healthy capacity.  Too much stress and stimulation.  I start to question whether I can ever have a full, satisfying life, if such minor changes to my routine can cause such a disturbance in my mood.

From there, I start to devolve into self-blame and self-loathing.  I feel that I should be able to do these things, partly because they make me happy, but also because others are able to do them so easily.  I want to contribute to society.  I want to have fun.  I want to be happy.  But my illness repeatedly keeps me from achieving these simple goals.  I can’t seem to stay happy without getting too happy.  I carry always in my mind the fact that I must not become psychotically manic, the knowledge that a part of me still hungers for the terrible beauty that mania brings, and a heart-wrenching resignation to the alternative of being at least moderately depressed all the time.

My euthymia (“normal” mood) is fleeting and fragile.  In the sixteen years since I first became clinically depressed, it never seems to have lasted more than a month, or perhaps six weeks.  That has happened few enough times that I can remember each discretely and count them on one hand.  Add to that a couple of weeks of (hypo)mania each year, and color in the remainder with the cold, black thrall of major depression.

So to me, my (hypo)manic breaks have always been just that: a vacation from what feels like the mundane reality of exhaustion, physical pain, tunnel vision, panic attacks, vomiting, uncontrollable crying, nearly unbearable sadness, hallucinations, fixations on death, drinking binges, and the overall feeling that a thick woolen blanket is wrapped around me, keeping me from feeling or desiring a single thing except to disappear.  I’ve grown accustomed to looking forward to the weeks when I write 200 pages or crochet five projects or exercise 3 hours per day.

Since my psychotic break, I can’t have that pleasurable anticipation anymore.  Every time I feel happy or have a positive thought, I have to check in with myself:  Am I talking too fast for others to understand me?  Am I fixating on something, especially something goal-oriented?  Am I leaping around the apartment laughing uproariously?  Does everything burn too much brighter; feel too ecstatic?  It is exhausting, and it deprives me of much of the non-mood-induced enjoyment I might otherwise experience.

In addition to my policing of myself, I must also deal with the worries of my family and best friend.  They often perceive my mania before I’m willing to admit it even to myself, and from my point of view, they hound me to sleep and eat and relax until I can’t bear the sound of their voices.  I want so desperately to scream at them to leave me the hell alone and stop babysitting me, but most of the time I remain aware that what they are saying contains truth, and that I really ought to listen.

It’s not easy to admit that you can’t trust what your own brain is telling you, and that you must rely on others to tell you what is going on in your innermost self.  If I’m honest, sometimes I do things like stay up late just to show myself that I’m an adult who can do what I please.  Not very adult reasoning, and I’m not unaware of the irony in that.  But I’ve always been the rebellious one, asking too many questions and trying when I can to circumvent authority.  Sometimes I really want to do things that are bad for me, and sometimes it’s because I know they’re bad.  I know for certain that this is part of why I continue to smoke tobacco despite repeated attempts to quit.

I can’t help wondering what kind of future lies in store for me if the simplest additions of socializing and contributing to society push me into unhealthiness.  I feel acutely what Stephen Fry claims in The Secret Life of the Manic Depressive, that only 20% of bipolar people are ever able to function at the level they would without the disorder.  (Although I am skeptical about that definition; who knows, after all, who or what we would be without our illness?)

I wonder whether the extent of my life’s accomplishments lies in part-time parenting, writing blog posts, and crocheting stuffed animals.  And that prospect feels hollow and despairing.  I wonder, too, what I would do should, heavens forbid, anything happen to my son.  The only reaction I can begin to imagine is to kill myself, because without him, I really have nothing to live for.

I need more from life.  I need to be able to fill my cup without it overflowing, and as of now, I evidently have not discovered how to strike that balance.  I know that for the foreseeable future I must continue to treat monitoring and regulating my mood swings as my primary goal in life, however disheartening and painful that may be.  I must accept that there is and will be no unqualified happiness for me.  All of my sunshine will carry lengthy shadows.

I’d like to finish by sharing some lyrics from a song I wrote some years ago after a thoroughly unpleasant one-night stand, which I feel captures my problems aptly:

I read the apes stood tall and walked away from the trees

With heads held higher and a sudden desire for fig leaves and apple juice

Well, I was sculpted in an ice hotel,

Far from heaven and I’ve been through hell,

And the breath of life still melts me to my knees.

Like Icarus, I always seem to fly too close to the sun.  There is a burst of glory culminating in a disastrous melting of everything that upholds me.  I either burn hot and fast, or I lie cold and dry as ash.  There is rarely any in-between.  My vigilance must not rest.

Bipolar vs. autism: a personal account

I’ve met other autists and other people with bipolar, but I’ve never met another bipolar autist.  So today I’ll address what it’s like living with both together, and what I see as the similarities and differences between them.

Let’s start with a rundown of how the two conditions are defined and treated by the medical community.  Both are diagnoses listed in the DSM, the Diagnostic and Statistical Manual of Mental Disorders, which lists all possible mental diagnoses and their diagnostic criteria.  (It’s quite the tome, but very illuminating to skim through.)

However, autism, or Autism Spectrum Disorder as the latest edition of the DSM calls it, is a mental condition but not a psychiatric or psychological one.  It is, in fact, a developmental neurological difference, listed alongside mental retardation.  An autist has hardwired differences between her brain and a neurotypical one, though there’s a lot we don’t yet understand about how this works.  (I highly recommend a look at Temple Grandin’s The Autistic Brain for interesting details on this topic.)  As such, it can’t be treated or removed by drugs or therapy.  Certainly, some therapies, like Cognitive Behavioral Therapy, can help a person become more functional, but they do this by instilling new coping skills and healthy behavioral responses, not by removing any part of the autism.

(Side note:  Some people may be more familiar with the term Asperger’s.  With the recent 5th edition of the DSM, this diagnosis is no longer in use.  It, along with Autism and Pervasive Developmental Disorder, has been reclassified in the more expansive Autism Spectrum Disorder diagnosis.  For my thoughts on this change, see The sinking of Aspergia.)

Bipolar disorder is something different.  It, along with clinical depression and related diagnoses, is a mood disorder, a subset of psychiatric disorders– a larger category that includes everything from addiction to OCD to schizophrenia.  We know that it relates to chemical imbalances in the brain, and that a genetic component seems to exist, but beyond that, not much is understood about its cause.  We do know, though, that in many cases it responds well to pharmaceutical therapies like mood stabilizers (lithium), anticonvulsants (e.g., Lamictal, Tegretol), antipsychotics (Zyprexa, Geodon, Clozapine), and to a lesser extent antidepressants (Prozac, Paxil, Celexa, Zoloft), as well as, in treatment-resistant cases, to electroconvulsive therapy.  Most bipolar people experience episodes of euthymia– “normal,” stable mood; neither manic nor depressed– during which it’s possible to see what their personality is in the absence of the ravages of extreme moods.

Now that that’s out of the way, you might notice that I use different phraseology when writing about bipolar and about autism.  This is something I feel very strongly about.  I consider bipolar a disorder I have.  I consider autism a way I am.  Sometimes, people think they’re being sensitive by using “person-first” language to talk about autism, that is, using the phrase “person with autism,” or saying “She has autism.”

To me, this phrasing is not only unnecessarily clunky, but uncomfortable and factually incorrect.  To say I have autism implies it is something separate from my innermost self, something I might (and, according to our conventions, maybe should) be “cured” of.  This is not true.  You cannot take the autism out of the autist.  There’s no way to make my brain neurotypical, nor would I want you to.  If you somehow magically did, I wouldn’t be me anymore.  My brain is my mind is my self, and my brain is autistic.  Therefore, I prefer to be called an autist, because I think it’s the term that’s the most accurate and elegant, but I’ll also willingly accept being called “an autistic” or “an autistic person.”

Also, it’s important to note that I don’t consider autism an illness or a disorder, but I do consider it a disability.  Here’s the difference:  Calling it a “disorder” implies it’s something gone wrong.  I don’t think it is.  There are things I love about it and things I’m not so fond of, but that goes for my overall assessment of myself and basically anything else.  But it is without question disabling for me, because I’m not able to live my best life in current conditions.  The world is simply not built for autists, even though a lot of it has been built by us.  It’s on those terms that disability and ability must be judged.

Bipolar disorder, on the other hand, I have no problem calling both a disorder and an illness, and when talking about it, I’m fine with person-first language, though I also accept for convenience’s sake being called a “bipolar person.”  (“A bipolar” is a bit more iffy and icky.)  I know that there is a version of me that is not either depressed or manic, because this was me for the first 11 years of my life– up until puberty and stress brought on my first depression– and it’s even still me occasionally now.  Would I cure my bipolar if the possibility were given to me?  It’s hard to say, and too hard for me to answer here.  But it is at least conceivable.  And it is also, clearly, a disability, far more so than autism in my own case.  Even if all possible accommodations were given to me, it would still keep me from living the life I desire.

So what’s it like living with both?  Confusing!  Oftentimes my autism-based tendencies are at odds with what my bipolar is telling me to do.  For example, one of the key features of autism is having special interests.  For me, these are language and astronomy.  I love reading about both of these topics, as well as learning new languages and looking at the stars.  However, often I am so depressed that I am incapable of doing any of these things.  I try to read, and the words make no sense to me.  I’ve had a beautiful large telescope for almost a year, having begged for it as a birthday present from my parents, and I’ve barely used it at all because I’ve been too depressed to get it working properly.

Being autistic also makes me particularly susceptible to stress.  I get easily overstimulated by being around too many people or in unfamiliar environments, and I don’t do well with changes to my routine or environment.  This is often at odds with my manic desire to be everywhere and do everything at once.  In addition, such stress is frequently a trigger of mood swings for me, meaning that I have to be very careful not to exceed my limitations, even when I’m feeling grandiose and exuberant, lest the overstimulation push me into full-blown psychosis or a deep depression.

Several psychologists and psychiatrists have tried to “narrow down” my diagnosis to either autism or bipolar, finding it doubtful that I truly am both bipolar and autistic, having never dealt with such a case before.  Those who saw me manic perceived my resultant confidence and doubted my autism.  Those who saw me depressed or euthymic perceived my autism and doubted that I was bipolar.  However, in February of this year I underwent rigorous testing that confirmed that I do indeed meet all the criteria for both.  This may be unusual, but it is the truth.

Another difference is that, by definition, autism has been with me since my birth, or at the very least, since I was quite small.  It’s not something that gets worse with age or sets in later in life.  I was a weird kid.  I’ve always had my special interests, though they haven’t always stayed the same; I’ve always been obsessed with data sets and collecting and organizing information.  As a child, I would spend hour upon hour doing things like memorizing all the countries in the world and their capitals and reciting them in alphabetical order, or staying up all night to record all the times the furnace turned on and trying to establish a pattern.  I lived largely in my head, and wasn’t terribly concerned with making friends; when I did spend time with other kids, I wanted to tell them how we would play and get them to act out what was in my head, not interact with them as equals.

I was always somewhat moody as well, but true bipolar symptoms didn’t hit me until I was 11.  I’d been homeschooled until then, according to my preference to learn more and faster than what school could teach me, without what I thought of as the interference of dull, fussy peers.  Then right around the time puberty hit me, I started seventh grade in a public school.  It was a disaster.  Being autistic manifested almost as a learning disability– I simply couldn’t understand and follow directions, or keep my work in order to turn in.  I also started to realize how different I was from my peers, and was deeply troubled by my inability to interact with them the same way they did with each other.  I was simultaneously bored, overstimulated, awkward, and clueless.

At the same time, I began to feel more and more deeply depressed.  I began a habit of self-harm that would last for years, right up until I discovered alcohol as a destructive coping mechanism.  I hated everything about myself.  I hated that I was autistic, though I didn’t know the term then or that it applied to me.  Suicide hung like a shining star in my dreams, something I didn’t yet have the courage to do but that I felt was inevitable in the long run.  All I wanted was to escape from the horror of my life.

A few years later, I was hit by hypomania for the first time, and it was the hugest relief.  It’s not that I suddenly had all the social skills I normally lacked, but I felt like I did.  I exuded overwhelming and inappropriate confidence– my dad described me as “imperious.”  I felt full of life and laughter and excitement.  Nothing could hold me back or hold me down.  I never imagined at the time that in a dozen years this feeling would progress into psychosis that would fill me with divine light only to leave me hollow, devastated and wishing to die.

Yet throughout it all, I never stopped being autistic.  I never lost my tendency to special interests, and I never really got any better at social interactions until I started drilling myself on them as I describe in Autism and friendship.  I first learned the term “Asperger’s” when I was sixteen, and while I hesitated to self-diagnose as many do, I couldn’t escape how closely the experiences I read about mirrored my own.  It became a special secret I had with myself:  that there was something that explained why I was so different, and that I was not truly alone.  I think this may honestly have saved my life.

The bipolar diagnosis, on the other hand, came out of the blue and wasn’t applied until I was 22.  I’d been experiencing what I now know was irritable hypomania.  I would stay up for days at a time writing and writing and writing on a novel I’d been working on, but it felt both good and odd.  I tried to talk to my then-husband, who in response was, as usual, verbally and emotionally abusive, so without his support I sought counseling.  The therapist I saw gave me my first look at the DSM description of Bipolar II and without hesitation I said, “Yes!  That’s me!”

I tentatively brought up Asperger’s, as it was then known, with her, but she brushed it off, assuming that it was an either/or problem, not a both/and one.  So I never brought up the topic with a professional again, until in 2012 I saw a new and very excellent therapist, who suggested it to me of his own accord.  “What do you know about autism?” he asked me, and I cautiously said, “I know some.  I feel like it describes me.  But I don’t want to self-diagnose.”  So he gave me a reading list and some website addresses, and the end result was that I came out knowing for certain that I was both autistic and bipolar.

I am lucky now to have both a therapist and a psychiatrist who accept who and what I am, and try to be sensitive to my unique needs.  I wonder sometimes if there are others out there struggling with the same dichotomies that have long troubled me.  I know now that developmental differences and mental disorders are not mutually exclusive, but can add up to something that is special and brilliant; devastating and debilitating; confusing and congruent.  I am one among many but also one among few.  Bipolar and autism have worked in tandem to make me who I am today, but I view them very differently, and think they should be discussed on very different terms.

One is an illness.  One is a brain difference.  Both have contributed to who I am today.  A bipolar autist; an autistic bipolar person.  The important thing to me is that I understand both, and can with that understanding move on to a better life, which I would never have had, had I listened to those who presented them to me as an either/or.  The brain is a complicated thing, a combination of hardwiring and chemicals and habits, and my brain demands a special understanding that few have been willing to tolerate.  It’s been hard to learn to understand this, but always worthwhile.

I hope that this exposition can help both those who are bipolar and those who are autistic, as well as, perhaps, the tiny subset who, like me, are both.  There is truly no end to the variety of types our brains endow us with.  Wherever on any spectrum you lie, there is beauty and pain, life and death, ignorance and awareness.  And knowing who you are is the first step toward both change and acceptance.

Autism and friendship

I wrote recently about how disabilities, and autism in particular, can affect romantic relationships.  So I thought it was time to address the related topic of how being an autist has affected my ability to form and maintain friendships.  I’ll focus on adult friendships here, because I think they are of a fundamentally different nature from childhood ones.  I may blog more in the future about my experiences as a child.

As a teenager, I was a confirmed loner.  I simply lacked the skills to form even the most basic of friendship bonds, and most of the time I wasn’t really interested in spending time with others, anyway.  But I realized that I needed to be able to relate to people better to succeed in life, and also to avoid humiliating myself, which has always been one of my biggest fears, perhaps because it’s happened so many times.  I did get lonely, and more than that, I desired approval– validation– confirmation that I wasn’t as unlovable as I felt.  So I devoted myself, over the course of several years, to watching and learning, and by my early twenties, I had developed enough skills that I was able to engage in relatively standard ways– up to a point.

Shortly after my son was born, I moved to a new city, and through receiving breastfeeding counseling and taking my son to a toy lending library, I encountered some other parents with values similar to my own.  I was invited to a support group, a playgroup, and several birthday parties, and this led to occasional one-on-one playdates.  Being a parent greased the wheels:  We had something designated acceptable and mutually interesting to talk about, and when there was an awkward silence because I was lost, it was easy to redirect attention to something the kids were doing.

Nevertheless, such socializing remained a nerve-wracking and exhausting experience for me.  I looked forward to it, but I also dreaded it.  It was hard to remember what things were rude or too blunt to say; how often to nod, smile and say “uh-huh” to show that I was listening; how much it was okay to talk about my special interests without seeming weird; when it was time to talk without interrupting.  And I never did get the hang of eye contact (and still haven’t.)  If I tried, I simply stared, which made it appear I was being creepy or romantically interested, and it made me uncomfortable anyway, so I just gave up on that part.

Partly because of these difficulties, I never really considered my “mommy friends” to be real friends– more like acquaintances.  We spent time together because of two factors: our children’s ages and our parenting style.  I liked some of them a lot and wished I could get to know them better, but I didn’t know how, and frankly, they didn’t show any interest or make any effort, so I felt that I would be imposing on them if I tried.

In some cases, this was deeply hurtful.  Once, I met two women at the same time as they met each other.  I chatted with each of them, and I went on to spend time with both.  I really liked both of them.  However, it was clear that they “hit it off” with each other far more than they ever did with me.  As time went on, while I would have the occasional playdate with either of them, they became very, very close.  I knew this because I heard each of them talk about the other, and because I was privy to their Facebook interactions.  They went for impromptu walks together.  They called each other on the phone just to talk, and talked about their feelings.  They invited each other to family gatherings.  They talked openly about how much they “loved” each other.  (Note, this was a platonic relationship; both were involved with members of the opposite sex.)

I never did any of these things with either of them.  I wanted to.  But I didn’t, and I didn’t know how.  I cried about it.  I drank because of it.  And I felt pathetic for doing both.  I didn’t know when it was and wasn’t okay to call someone, or what to say if I did.  I didn’t think it would be okay for me to talk to them about my feelings, because they didn’t with me.  If I messaged them and suggested a walk or lunch on the same day or the next day, they invariably had something else planned (or said they did.)

At one point, I found out on Facebook that they were having a crafts group with several other mutual acquaintances.  Now, I was and am one of the most crafty (in the sense of making crafts, not of being manipulative) people I know, and I knew that they knew this because I would make things to give to them.  Yet they never invited me to this group, and I was at a loss as to what to do.  Could I invite myself?  Was there any point, if they clearly didn’t want me there?  I was devastated, not so much by not getting to go but by feeling left out and unwanted.  I blamed myself– probably rightly?  maybe not?  I don’t know even now– for being awkward and unlikeable.  It seemed like a confirmation of the way I’d judged myself all my life.

One of the things about autism is that it doesn’t just make you awkward, it makes it impossible for you to know what other people are thinking.  You spend all your time wishing fervently that people would just tell you what they want from you.  I constantly thought, “What exactly am I doing wrong?  Why won’t they just tell me if I’m being rude or inappropriate?  Or if they just don’t like me that much?”  I expect they were sending out plenty of cues that I just didn’t have the capacity to read– my experience with watching, mimicking and practicing had taught me how to go through the motions, but not really how to understand the content.

After that experience, I distanced myself further from my “mommy friends,” because I felt alienated and unwanted.  For several years, I really didn’t have any friends at all.  I never saw or talked to anyone except family and my abusive romantic partner.  Then I went back to college and entered a new relationship, and both of these introduced me to new people who seemed to have some interest in spending time with me.  By this point, I’d further developed my skills; I understood roughly how to “hang out” and play video games or make some somewhat stilted small talk.  I finally managed to accumulate a handful of people I’d call, as the Brits might, “mates,” if not close friends, and interacting regularly with them allowed me to relax some, be myself a bit, and feel like we shared an actual connection.

That’s basically where I am now.  But I’ve continued to experience difficulty and disappointment.  I generally find that people do not take the initiative to spend time with me or call me, and that I must therefore always do so; even when I do, I frequently fail to get any response.  In some cases, I’ve interacted fairly regularly with a person, only to have them suddenly stop responding at all, or blow me off every time.  This confuses me.  Again, I am left wondering, am I doing something wrong?  Do they really not like me that much?  My best friend, my former Person of Interest, gets irritated when I worry about this, and tells me, “That’s just how friendships are.  It takes hard work, and most people don’t put in the effort.”  Okay, so if that’s the case, how do they ever manage to spend time with people other than me?  How do I develop a close enough bond that they will actually think of reaching out to me of their own accord?  How can I be sure that the problem isn’t actually that I’m screwing up in some way?

So my message to fellow autists is: keep trying, and things will get better.  They won’t be perfect, so don’t feel alone when it doesn’t always work out, or when you have trouble making and keeping the friendships you’d like.  It’s not your fault that you can’t read people.  It doesn’t make you a bad or worthless person if you do get rejected, or think you are.  There are loads of us out here who feel the same.

And to the neurotypicals who care about autists, please, just be straight with us.  If we’re acting inappropriately or just in a way you don’t like, say so– not in a mean way, mind you, but with kindness and care.  Interpersonal relationships are harder for us than you can know, but that doesn’t mean we don’t want them and want your approval.  Maybe take the time and put out the extra effort to let us know if you do want us around, because otherwise, we’ll likely assume you don’t.  Chances are we’ve been hurt a lot, and when you just exclude or ignore us without giving us a reason, we really don’t understand why.

Ten things never to say to mentally ill people

One of the frustrating aspects of having a mental illness is dealing with stuff people say, even though they often mean well.  Here are just a few examples of things I suspect most mentally ill people are used to– and very tired of– hearing.

1) “But you seem so normal!”

Perhaps this is meant as a compliment, but honestly, it feels much more like an accusation– “you can’t really be sick.”  People with many forms of mental illness go through patterns of remission and relapse, so how you see us on any given day may be far from representative.  For example, I am currently euthymic (in a “normal” mood phase) but a month ago I was hearing voices from the heavens and believing I was a chosen conduit, and a week after that I was seeing rotting corpses hanging from my ceiling.  Not “normal”, right?  Also, many of us are great, sometimes to our own detriment, at putting on a happy, calm face around others, when we are being ripped apart inside.  And finally, sometimes this “normalcy” is the result of a treatment regimen that we have to work hard to settle on and stick to, which is not something “normal,” i.e. healthy-brained, people can really understand.  Don’t judge us by what you see; listen to what we have to say about our experience instead.

2) “Have you tried natural remedies?”

This is a tough one, because people who say this genuinely think they’re being helpful.  However, it’s important to remember that many psychiatric disorders are very difficult to medicate.  In the case of my illness, bipolar disorder, upwards of half of patients are classified as “treatment resistant.”  Many people spend years working with professionals to develop a successful treatment regimen of drugs and other therapies that are scientifically proven to work.  This can be a tenuous balancing act of avoiding side effects, toxicity and drug interactions.  In most cases, “natural” remedies are not only unproven but possibly unsafe, especially in combination with prescription drugs.  Keep in mind that whatever you have to suggest, we’ve probably heard of it before, and it can be deeply frustrating to be bombarded with well-meaning suggestions from people who are not experts in psychiatry.  Please, leave clinical treatment to the professionals.  If you want to be helpful, try instead asking what you can do to support the person.

3) “At least it’s not cancer!”

This is a deeply offensive and dismissive statement, even if it’s intended to make someone feel better.  Implying that psychiatric disorders are less serious or destructive than physical ones is not only unkind but flatly inaccurate.  Mental illness not only ruins but all too frequently ends people’s lives.  According to Stephen Fry’s outstanding documentary The Secret Life of the Manic Depressive, only 20% of people with bipolar ever become fully functional, while 50% will attempt suicide and nearly half of those will succeed.  Those are abysmal numbers for any illness.  Furthermore, we deal with many of the same difficulties that physically ill people do:  drug side effects, exhaustion, and physical pain, just to name a few.  Dismissing the severity of our problems makes us feel worse, not better.

4) “My ______ had that and here’s how they got better.”

I remember a specific conversation I had years ago in which a woman I barely knew told me in detail about her father’s undiagnosed mania and how he was able to “talk himself down” from it, and that therefore drugs and therapy aren’t really necessary, but actually inhibitive of self-help.  Another told me how fish oil had cured her brother’s depression.  These are just a couple of examples of the same basic conversation I feel like I’ve had a million times.  It’s problematic because no two mentally ill people, even those with the same diagnosis, are alike.  Our treatment needs to be based on scientific evidence obtained through clinical trials, not anecdotal evidence you’ve gathered from your limited frame of reference.  Assuming that our illness takes the same course as that of someone else you know (who may or may not have even had or needed the same diagnosis) keeps you from understanding what our experience really is and what we actually need.

5) “Everyone seems to have that nowadays!”

Short answer:  No.  They don’t.  You probably have that impression because of the way clinical diagnostic terms are bandied about inappropriately, for example, saying of someone emotional and moody “She’s so bipolar” or “borderline,” or of someone shy and awkward “He’s so Aspie.”  (Although autism is not actually a mental illness, the same point applies.)  There are also many people who self-diagnose, to varying degrees of accuracy, often without understanding the reality of living with a severe disability.  While many disorders are more common than you might expect– in the case of bipolar, it affects about 1% of the population, so that if you have several hundred Facebook friends, at least a few probably have it– they are not catch-all terms for every difficulty and behavior problem you perceive.

6) “Have you prayed/meditated/sought spiritual help about it?”

As an adamant, skeptical atheist, I particularly resent this one, but I would still resent it were I religious.  Frankly, my spiritual life, unless I choose to share it with you, is none of your damn business.  Carl Jung’s waxing on about the need for religion notwithstanding, there is absolutely no scientific evidence that prayer, meditation and other such practices can either treat or cure mental illness.  Some people may find comfort in spirituality, and that’s great, but it doesn’t take the place of medication and professional therapy, nor is it a necessary component of treatment.  My illness should never be a pretext for you to proselytize or cast moral judgment on me.

7) “Think about people worse off than you.  Lots of people would kill to be where you are.”

Again, this is deeply dismissive and hurtful.  It gives the impression that you clearly do not understand the depths of our pain and difficulty.  You are judging us by external factors like first-world citizenship, economic stability, and family support, which are all wonderful things but do not do away with or even necessarily diminish the anguish and dysfunction with which we live every day.  If you think my life is so great, I invite you to switch brains with me any day and see how you like it.  Hint:  You won’t.

8) “That’s a form of genius!”

Movies like The Aviator, about OCD sufferer Howard Hughes, and A Beautiful Mind, about schizophrenic John Nash, have contributed to this perception.  It’s true that people with many mental illnesses, such as bipolar and schizophrenia, can be highly creative, thinking in ways that a healthy brain rarely does.  However, such illnesses– and often the drugs that treat them– are equally likely to hold back mental function.  Some people with mental illness are very intelligent and creative.  So are some people with healthy brains.  And some are not.  Mental illness is, well, illness, nothing else.

9) “You’re really just an addict.”

Many, many people with mental illness, myself included, struggle with substance abuse as a form of self-medication.  And in many cases this does make matters worse.  However, it’s crucial to recognize that the substance abuse is a result, not a cause, of the underlying mental problem– a damaging coping mechanism to which we turn in absence of other effective treatment.  For more on this scientifically supported model of addiction, I highly recommend the book The Sober Truth, by Lance Dodes, and particularly chapter five, titled “So, What Does Work to Treat Addiction?”

10) “This famous person had/has that and was a great success, so you can be too!”

Recall again the numbers I stated above on the outcomes for people with bipolar disorder.  I am no expert on the numbers for any other disorder, but I do know that a few outstanding cases– again, like Howard Hughes and John Nash, and also like Stephen Fry, Richard Dreyfuss and others– are far from representative.  There are a multitude of external and internal factors that determine whether someone with mental illness can create a fulfilling, successful life, and for many they are not advantageous.  Asserting this is no different from saying that because some people who drop out of school become rich and famous, anyone who drops out can be.  In some cases, such limitations prove not to be an obstacle to success, but in most, they are.  By positing illness as part and parcel of success rather than as a disability, you dismiss the pain and frustration most of us experience when we fail, so frequently, to measure up to the expectations of both society and ourselves.

In conclusion

It’s understandably hard not to feel like you’re walking on eggshells when talking to and about disabled people.  Everyone makes mistakes when trying sincerely to be helpful; that doesn’t make you a bad friend or human being.  However, a few simple things to keep in mind can reduce the chances of you saying insensitive, hurtful things like the ones I’ve listed.

Listen to us.  Don’t speak for us, and don’t assume.  You are not an expert on our experience, and your active listening and informed support are far more valuable than anything you could say.  Don’t dismiss us.  You don’t need to try to make us feel better about our situation; this feels patronizing, and furthermore, downplaying our suffering is insulting, not supportive.  And lastly, be ready to be corrected, and to apologize.  Don’t take umbrage if we say “that’s not accurate” or “that’s not helpful.”  We have the right to stick up for ourselves and to educate you about what you don’t understand.  In general, just let us guide the discussion.  Ask questions, pay attention to the answers, and don’t be full of yourself.  If you can do that, we will get along just fine.