…I want to peer down into that darkness, and see what’s there — to immerse myself in American magic and dread… And, equally, to induce in my readers the vertigo that comes from gazing too long into the cultural abyss — then give them a loving shove, right over the edge.
— Mark Dery, I Must Not Think Bad Thoughts
Gone are the halcyon days of consumer culture, when everyone watched the same TV spots, was mesmerized by the same dishwasher detergent, and coveted the same big-finned Coupe de Villes. Blotting the old mass media from view is the ever-spreading cloud of the internet and its byzantine “network culture,” which Mark Dery, writer and cultural critic, anticipated in the nineties. Dery, an early thinker on cyber- and technoculture, galvanized academic interests in cyberfeminism and Afrofuturism (a term he coined). He was, as Bruce Sterling points out in the introduction to Dery’s new book, “a prophet who predicted the past.” And now he’s moving on.
Dery has ditched the cyber beat, diving headlong into the grotesqueries of the “American Gothic,” which he defines as “the stomach-plunging drop from reassuring myth to ugly truth — the distance between our dream of ourselves and the face staring back at us from the cultural mirror.” His recent collection, I Must Not Think Bad Thoughts, is an excursive grouping of cultural postmortems, reports from the dimmest recesses of American consciousness.
His task today, as he’ll be the first to point out, is complicated by a growing legion of citizen-critics born of American Idol, a tangle of comment threads, and the insta-judgment encouraged by Reddit and Facebook. Mass media culture, it seems, has grown and adapted; it has assimilated into our lives and therefore become less susceptible to critique — even as cultural criticism abounds. The critical examination of virtually anything takes place virtually everywhere, the analysis of pop culture becoming as commodifiable as the pop culture it considers. This is the logical extension of Dery’s 2007 essay “World Wide Wonder Closet,” in which he remarks with some horror that the “Gothic network culture” is upon us.
These, in other words, are some very Deryesque days.
We meet in the home office of Editor-in-Chief Tom Lutz, where Dery asks for coffee, “black as night.” His sheened black hair is combed back, and he’s dressed in his trademark attire — a black suit and, underneath, a dark striped shirt. The subject of fashion, incidentally, is where our conversation begins.
Mark Dery (MD): There’s a short dissertation to be written about the fashion failures of writers these days. For instance, every time I see a photo of David Foster Wallace wearing that bandana on his head, it makes my inner Don Draper cringe. What kind of age do we live in where our best writers dress like Little Steven from the E Street Band?
Mike Goetzman (MG): DFW had some justification for the bandana though, no? He was a profuse sweater, and it was a more reasonable alternative to carrying around a towel...
MD: I thought the justification was that he was a tennis player, the sort of literatus who’d channel the Tennis Bum Within even if he were being inducted into the Ordre des Arts et des Lettres. You know, there was a time when writers knew how to dress. Look at Oscar Wilde in his lecturing costume. He didn’t dress that way all the time, of course; that was concocted. He was an early pioneer of self-branding. He called it his “lecturing costume” for a reason: it was his attempt to brand himself as an aesthete. It was very shrewd because all the New York newspapers and magazines of the day, such as Puck, had these caricatures of him in his velvet knee breeches swanning around on stage with a long-stemmed lily; it was an instantly recognizable image, no less so than Colonel Sanders’s string-tie.
MG: We see that sort of branding in celebrity culture, though not much from our celebrity writers. Generally speaking, I think there’s been a shift away from any outwardly showy sartorial displays, toward a more dressed-down, self-consciously “authentic” look, but one that is just as meditated as, say, Oscar Wilde’s costume.
MD: It’s fascinating to me that you’ve observed this dressing-down as a trend because, generationally, I’m clueless about it. Was it inspired by DFW, this self-effacing, flannel-wearing, Kurt Cobain-esque persona?
MG: We’d probably have to find the original historical locus of people’s obsession with seeming authentic, which is anybody’s guess. But, yeah, you could probably make the case that this most recent round of Urban Outfitters sincerer-than-thou hipsterdom has its roots in nineties grunge and slacker culture.
MD: Wilde himself famously said, “It is only shallow people who do not judge by appearances,” which sounds to me like a shot across the bow of the flannel-shirt gang in the never-ending kulturkampf between Authenticity and Artifice — grunge and glam, the Dandy and the Working-Class Hero. Surface is depth, Wilde is saying, or at least it can be; the way a man wears his hat is a many-layered signifier, articulating his aesthetic philosophy. And aesthetics is profoundly political. Style is politics, Wilde argues; form can be its own content.
I was on a panel recently with a writer who shall remain nameless but who had just been written up as the greatest writer in the Pacific Northwest (or close to it). I showed up in my usual drag and was astounded to see that he had a plaid shirt on, tail untucked and flapping, and what appeared to be a crew-necked white T-shirt underneath it — which, for my generation, is something only old men do... Um, I see that you… you’re sporting one. It’s a very odd fashion move, one that scans, to Men of a Certain Age, as geeky — Felix Ungerian in its fastidious attention to hygiene.
MG: So you would never wear an undershirt.
MD: No, never. Just a shirt. So now everyone your age wears an undershirt? Have men started carrying cloth handkerchiefs again?
MG: No... Not yet, at least. I never knew the undershirt was considered a fashion faux pas.
MD: It’s a fashion statement straight out of Bogart movies. (Although Bogart would have worn a wifebeater, naturally.) In any event, such questions are interesting to me because one of the reviews of the book, the only one that was virulently negative — uniformly, thoroughgoingly, irretrievably negative! — suggested that as I am in fashion, so I am in style. That’s my elaboration of the writer’s point, I should note, but he was clearly arguing that I’m out of step with your generation; what seemed to really nettle him was that I wasn’t sufficiently self-deprecating — not that my voice, as a writer, was too intrusively flamboyant, or whatever, but that I even had a voice.
I associate this critic’s response, and the seemingly universal embrace of DFW as the Voice of the Post-Ironic Generation, with changing modes of masculinity — a neurotic, self-effacing stripe of masculinity that seems to be tuned to the same frequency as, say, Jedediah Purdy’s For Common Things: Irony, Trust, and Commitment in America Today and DFW’s social-satirical critiques of Dennis Miller-ian irony in Infinite Jest and his essay “E Unibus Pluram: Television and U.S. Fiction.” I am, as you know from the book, very interested in metrosexuality and heteroflexibility, post-metrosexuality — the entire continuum of polymorphous variations on American masculinity. So I have no problem with neurotic, hand-wringingly introspective, confessional masculinity. But the association of that with a certain literary style, an aesthetic politics that rejects irony for what strikes me as an unexamined, perhaps even reactionary embrace of Authenticity (with a capital “A”), is interesting to me. What am I missing?
MG: I think it’s the result of a sea change, if that’s not too strong, in contemporary writing over the past decade or so. That introspection, the recursive doubling-back onto your own thoughts, the footnote upon footnote, the tendency to pre-empt any criticism by trying to embed any and all possible criticism within the work itself — all of these have become hallmarks of a certain young, predominantly male, bloggerly style.
MD: Well, that forensic strategy — the time-honored rhetorical tactic of launching pre-emptive strikes against all conceivable counterarguments — impresses me, ironically, as somewhat macho. At least, it’s a kind of theory-jock macho — the pugilistic, polemical style of the proverbial Semiotics Major from Brown. I had the impression from you (perhaps wrongly?) that a slackerish self-doubt was inherent to this post-Wallace aesthetic, but what you’ve just described isn’t the Montaigne who asks “What do I know?” and whose essays are as much process as polished artifact, an attempt (essai) to tease out the tangled threads of what he thinks; it’s closer to the bareknuckled Oxford debating style of Hitchens or Dawkins or Harris.
What are the literary hallmarks of this “young, predominantly male, bloggerly” aesthetic? The reviewer I mentioned seemed to bridle at what he saw as an imperious, puckish hauteur in my writing. (“Puckish” in the sense of “What fools these mortals be,” not just wry playfulness.) He seemed to be rankled by what he perceived as the constant ironizing in my work, which he seemed to think of as a sort of nineties, Dennis Miller-for-Derrideans phenomenon. Contra that, he seemed to insist on a foot-dragging earnestness.
Sincerity seems to be in the air again. It’s appalling. What is this a rebellion against, either as an aesthetic movement or a historical dialectic? A refutation of, or reflexive recoil from, the legions of snark monkeys in their sniper’s nests at Gawker? Does it stand against terminal ironizing, a consumer culture where we live our mental lives in air quotes within air quotes? What are the politics of style, here?
MG: It depends on how far you choose to look back. Like the New Sincerity movement that began in the eighties, which DFW captured in “E Unibus Pluram,” arguing that the real rebels of post-postmodernism will be those that reject irony and risk the eye-roll, the charge of sentimentality, the “Oh, how banal.” But you’re not alone in your indignation. Maud Newton recently wrote a semi-controversial piece in The New York Times about this trend toward the aw-shucks, I-could-be-wrong-here approachability of many writers now. She pins it on big names like DFW and Dave Eggers, who she believes have inspired legions of nascent novelists and bloggers to adopt a similarly slangy, self-conscious appeal. Zadie Smith says this all stems from Generation Y’s desire to be liked — in the Facebook sense of the word.
MD: It’s too often forgotten that there’s always a historical pendulum swing between authenticity and inauthenticity, between a love of artifice and a crushing earnestness. It seems to me that you need both; each is, as all philosophical base pairs — hierarchical dualisms, binary oppositions — always are, defined in terms of its opposite, and therefore a kind of conjoined twin, unthinkable without its Other. I mean, you never have thoroughgoing artifice, and you never have utter sincerity and earnestness. Which is why the critically unexamined fetishization of Authenticity with a capital “A,” inevitably attended by the Jedediah Purdy-style Hunting of the Snarky, really does bring out the Mencken in me. I mean, the killing earnestness of the thing makes it impossible to stifle a derisive guffaw; it’s like the death of Little Nell! And the telegraphing of this literary politics with the scruffy beards and the untucked Pendletons and the confessional paroxysms and the tail-chasing self-examination and the attempt to go meta upon meta upon meta about yourself, you know, is, on the one hand fascinating because it’s like the famous Beaumont medical experiment where you’re observing digestive processes in someone’s belly through an actual opening in his stomach. As a real-time X-ray of a cognitive process, of the mind at work, it’s completely fascinating. But hardly A New Thing Upon the Face of the Earth.
MG: You seem to be much more rhetorical and forensic in style than this crop of “sincerity” writers. Your essays are often more polemical than meandering and discursive, though there are also those where it’s clear that you don’t necessarily know what you think and you’re sort of beckoning the reader to take the dive with you.
MD: Right: unspooling Ariadne’s thread as we go, so we can find our way back out. I feel that I dip into both wells, as you say. When Montaigne talks about the essay, one of the things he’s emphasizing — in addition to the use of himself as a prism for the refracting of trenchant observation and thought about the moment he lives in — is anatomizing his cognitive processes as they happen, which for me is the quintessence of the essay.
MG: That review you mention might have been touching on the theoretical distance you seem to maintain, even when you turn to scrutinize yourself, like in the collection’s final essay “Cortex Envy.” The misconception, I think, is that the theoretical distance precludes any vulnerability, where, actually, it’s an extremely personal essay — you’re parsing yourself — but, on the page, the vulnerability is not as palpable as what we’ve grown accustomed to. That there is no breakdown in style accompanying the breakdown in ego smacks of inauthenticity.
MD: I reflexively (and intensely!) dislike the idea that the personal essay is all about psychology instead of intellect, and that we proceed on the plank of sincerity and that the way you know I’m sincere, what ratifies my confessional credentials in your eyes, is that I “lose” it, or at least engage in the Iowa-Workshop equivalent of psychodrama — a rhetorical simulacrum of primal screaming, if not the real thing. [This] reminds me of one of the great hard-boiled lines, I think it’s in The Maltese Falcon, when at one point Sam Spade’s girlfriend chides him for always being too glib and he says, “What do you want me to do, learn to stutter?” That’s my response to this periodic historical return of the fascism of sincerity. Why must emotional honesty, or the anatomization of the self, or the interrogation of his own mind by the skeptical inquirer, be enacted at a formal level by rhetorical devices handed down from the Transcendentalists or the Beats or DFW or whoever is your chosen standard-bearer for sincerity and authenticity?
MG: In one of your essays, “Apocalypse Culture and the Escalation of Subcultural Hostilities,” you maintain that late capitalism has made it really difficult to be transgressive or shocking. Seeing as your cultural beat is often seen as the transgressive, lunatic fringe of America, where does that leave you?
MD: Well, in that essay I argue there has been a consistent waning of the transgressive because of its recuperation by and absorption into the Spectacle, market capitalism, consumer culture, what have you. Ironically, this is one of the complaints of the hipster literary demographic that has become my whipping post in this conversation. What they might find most pernicious about postmodernism, the so-called cultural logic of late capitalism (there’s a grad-school flashback for your readers!), is the suggestion that every ritual of resistance, every act of rebellion through style, every subcultural transgression, is almost instantly identified by marketers, branders, cool-hunters, focus-groupers, and demographers, then appropriated by the marketing armature of the culture and turned into a mass-marketed signifier of rebellion.
Once upon a time, I staked my claim at the far fringes of transgressive subcultures as a sort of subcultural ethnographer, a Martian anthropologist delving deep into what made these subcultures tick and then, in a sense, explaining that to mainstream America — but simultaneously making the Clifford Geertzian participant-observer move: going native and examining these subcultures in the way that, say, Erik Davis did in his writings about Star Trek fandom. He wrote a marvelous essay called “Klingon Like Me” about the ways in which people have appropriated the Star Trek cosmology and grafted their own beliefs onto it, creating a mash-up of official narratives and personal mythologies that would be utterly unrecognizable to Gene Roddenberry. Obviously, that day of being the guy that tells you about a vanishingly obscure subculture devoted to duct-tape mummification and pleasure dungeons for crush fetishists or what have you is well and truly over, because all this stuff is just a mouse-click away, now. None of this stuff is so irretrievably weird, so vanishingly obscure that you can’t dial it up on the web within an instant. And now there are just so many websites whose job it is to swim through the bathypelagic depths of the web like baleen whales gliding along and devouring in their enormous maws the krill of information weirdness. I mean, look at Boing Boing: thousands of posts a week ferreting out the most unimaginably weird or obscure stuff. Any reasonably culturally literate graduate student today knows about things that would have required serious sleuthing-out in the nineties, before the web went public. The cultural critic who explains it all for you was a real niche in the media ecology of the times, at least in terms of the excavation of transgressive subcultures.
MG: To be a cultural critic or semiotician today seems infinitely more complex a task than it was just a half-century ago when, say, Barthes was on the job. Today, media culture has grown and evolved, contributing to our lives as much as we contribute to it — through social media, blogs, amateur criticism, all ideas with which you grapple in your essay “World Wide Wonder Closet.” How do you manage as a cultural critic in a world where we don’t have commonly shared images or signs; where everyone, regardless of whether he or she is one, considers him- or herself a critic?
MD: It’s an absolutely fascinating and well-put question, and in my inimitable fashion I’ll answer it by not answering it, which I think the ghost of Roland Barthes would smile wryly at. (Parenthetically, I love Barthes, not least because he’s a useful historical corrective: he reminds us that, not so long ago, people doing cultural theory were perfectly comprehensible to the reasonably educated, culturally literate reader — as always, an elite, but a far larger one than can make sense of, say, Gayatri Spivak. In light of the thermonuclear mushroom cloud of theory and the long shadow it cast over graduate seminars in the nineties, this is a point worth making, I think. I mean, a number of the pieces in Mythologies and The Eiffel Tower began life as essays in the popular press, as did Umberto Eco’s Travels in Hyperreality.)
Barnstorming for Bad Thoughts, doing online discussions and interviews for the book, I’ve been fascinated and appalled at the reflexive recoil, in some quarters, from the term “cultural critic.” In tech culture, in particular, I’ve been reminded how, well, “benighted” might be too pejorative, but how historically oblivious a certain species of reader is to this thing called critical theory, codeword “postmodernism,” which to them is a blizzard of impenetrable academese, what the English call “French fog”: Lyotard, Baudrillard, Derrida, Lacan. (Of course, as I said earlier, some critical theory is encrypted in obscurantist jargon. But some of the best, such as Barthes and Benjamin, is not.) But discovering that the term “cultural critic” — unremarkable in academic discourse — is so polarizing and precipitates such howls of incredulity in so many quarters has been a shock to me. I just did this two-week book-tour-of-the-air, so to speak, on the venerable virtual community The WELL — a kind of Winesburg, Ohio of the Net, if you can imagine Sherwood Anderson’s dysfunctional small town repopulated with libertarian Deadheads who can hack — and it was like the burly brawl in Matrix 2. Something in geek DNA does not love the cultural critic. I felt like Neo fending off an endlessly self-replicating army of Agent Smiths.
One of the very vexed questions in this head-butting session was the unalloyed presumptuousness and face-slapping arrogance of calling oneself a cultural critic. To a certain species of techno-literate, engineering-minded, civil-libertarian-of-cyberspace type, the term is just self-evidently preposterous. I mean, Joan Didion did cultural criticism, Susan Sontag did cultural criticism, Umberto Eco, Roland Barthes, and, yes, Žižek and even Chuck Klosterman. So did Lester Bangs at his most exalted, so did Tom Wolfe, so did Norman Mailer, so did Leslie Fiedler, so did Marshall McLuhan, so did Orwell in some of his essays, and of course Montaigne was a cultural critic par excellence, in my book (though many academics would debate that assertion to the death, since cultural criticism is typically theorized as beginning with the Frankfurt Marxists). There’s cultural criticism for every taste and niche on the ladder of lowbrow, middlebrow, highbrow and, as John Seabrook was at pains to inform us, nobrow. So the idea that you tar yourself with the brush of elitism by using that term is just eye-crossingly weird, to me. I never imagined it would be such a stumbling block for so many people, but you know, because this is America and we’re all clever and classless and free, the idea of setting yourself up as a cultural critic is like marching into battle wearing Cap’n Crunch’s cartoon epaulets. It’s seen as inescapably Little Lord Fauntleroy-ish. Which brings us, by twists and turns, back to Wilde’s velvet knee-breeches and the politics of style or, more exactly, style as politics.
MG: So do you see this distinctly American resistance to the term “cultural critic” as of a piece with American resistance to anything polysyllabic, any style too “purple,” anything, in short, that smacks of intellectualism?
MD: Yes, I’m always struck by the vigor with which guardians of conventional wisdom police the boundaries of acceptable style. Hofstadter’s Anti-Intellectualism in American Life was never more relevant than now. People’s radiators boil over when they’re confronted with a certain set of aesthetic decisions and stylistic hallmarks, all of which are associated with a political platform — political in the sense of aesthetic politics — which is about the embrace of artifice, which is about irony and cynicism as a legitimate philosophical stance going all the way back to Diogenes, a sort of mordant, social-satirical, ironic embrace of the Real America, to hijack Palin’s phrase — loving best that which is worst about America. And I do love to hate what is most deplorable about America. It’s Hathos, right? A fond contempt, a cordial loathing that is at once ironic yet profoundly earnest in that its cynicism is really just a guttering (but inextinguishable!) idealism that swallowed the utopian dream of democracy hook, line, and sinker. It recites William S. Burroughs’s blackly comedic “Thanksgiving Prayer,” but has a secret weakness for the imagined community of Norman Rockwell’s Thanksgiving.
There’s a real recoil, in the New Sincerity as you call it, from the stylistic moves associated with this philosophical posture. I reject the presumption that we have to restrict our vocabularies to what would be immediately apprehensible to Kim Kardashian, or the idea that you’re striking at the heart of American democracy if you send your reader to the dictionary. It’s not the merest arrogance. It’s not elitism. It’s just the opposite: a confidence in the intelligence of reasonably educated Americans that expects them not to be affronted by a word they don’t already know or an idea they haven’t already encountered.
MG: Do you still believe there is a role for the cultural critic?
MD: Yes, actually, Steven Johnson wrote this essay for The New York Times blog about the catchphrase “semiotics major from Brown” and how it has come to stand in for something — the flourishing practice of cultural criticism among young people, this percolating out of literary theory into hipster culture. You’ve got people reading The Awl for whom names like Lacan and Derrida and Žižek are at least passingly familiar. Whether or not they’ve read them, they’ve heard of them, have some inkling of what they stand for. That’s historically really unique. That absolutely wasn’t the case before the eighties, when Semiotext(e) started publishing those little Nintendo cartridge-sized books of Sylvère Lotringer’s translations of Baudrillard and Virilio that popularized Baudrillard and other continental philosophers among the New York art-istocracy, the hip-eoisie. So I think the hipsterization of theory, the seepage out into the cultural arena of concepts such as postmodernism, problematizes the idea of the cultural critic as a meta-explainer of culture, a mediator between taxonomies of taste and the demographics they imply, you know, highbrow versus lowbrow. There are a lot of really smart, culturally literate, theory-fluent people diving into the comment threads on publications like yours who are doing a kind of cultural criticism in those threads and on their blogs and even on Facebook, god help us.
I applaud this development and the growing critical vocabulary and theoretical literacy of a certain bandwidth of the readership as well as this wonderful efflorescence of sites now that are interested in longform, sometimes even belletristic nonfiction and essay writing. So that’s heartening.
MG: There’s a fluency and allusive density to your speech that’s also apparent in your writing. Is writing, for you, simply a matter of putting thought to page, or are the two — speech and writing — very different?
MD: Do I just translate my linguistic mind onto the page? Actually, it’s something far weirder: it’s the other way round. The way that I speak is the result of a childhood spent as a precocious weirdo who started reading at an early age. I don’t know if I was hyperlexic in the clinical sense, but I know I was reading the Carl Barks-era Walt Disney’s Comics and Stories by the time I was four. Later, I read Sherlock Holmes and Tom Sawyer and Poe and very early on got it into my head, for whatever reason, that this was the way to talk, or — a telling distinction — the way aesthetes talked. So I suppose the way I speak is some Bizarro-World bricolage of “literary” syntax and vocabulary and locutions, some of which are Briticisms, some of which are Victorian, some of which scan as formal or highbrow.
It’s the bizarre hangover of reading the wrong things at too early an age, I suppose; looking at the world through the wrong end of the telescope. Rather than saying that prose style is best that most closely traces the conversational contours and verbal syncopations of everyday speech as spoken by the average American on the next barstool, I took away just exactly the opposite message, namely: that speech is best that sounds most like literary prose.
But as the guy in Spinal Tap says, “I’m just as God made me.” That’s just how my mind works. It’s this gene splice of crap culture, which for a Southern Californian white guy of my generation consists of Big Daddy Roth, “Odd Rods” bubble-gum trading cards, junk food like Bugles and Funyuns, Revell monster models, H.R. Pufnstuf and Saturday morning cartoons like Josie and the Pussycats, The Banana Splits, Lancelot Link Secret Chimp, prog-rock bands like Yes and ELP and Uriah Heep and Jethro Tull, Izod shirts, tatami flipflops, puka shells and the David Cassidy shag, the bubble-windowed van with airbrush art inspired by Frank Frazetta — I could go on and on. There’s all of that and then there’s everything I’ve just said about my idiosyncratic literary appetite as a kid. It’s that mash-up, I think, that makes my sensibility so polymorphously perverse.
MG: You’re currently writing a biography on Edward Gorey, who seems to share your mordant sensibility, and yet you seem, to me at least, foils of a sort. In an essay you wrote about him for The New York Times, you quote him as having said, “When people are finding meaning in things — beware,” which seems at odds with your aim, as a cultural critic. What in particular interests you about Gorey?
MD: We certainly share an idiosyncratic sensibility, but Gorey, unlike me, wasn’t mordant in the least. On the other hand, he wasn’t as anti-intellectual as the quote suggests. He had a public style that he used with interviewers that was winningly evasive, I guess I would call it. He was an incomparable master at parrying interviewers’ thrusts, responding to a probing question about his sexuality with an elliptical anecdote about his favorite Golden Girls episode or an unblinking, poker-faced insistence that William Shatner was one of our great thespians. In the correspondence between himself and Peter Neumeyer, you see an entirely different Gorey, one that could be unsparingly self-analytical, even painfully so, to the point where it’s clear that he suffered some dark nights of the soul. So he could be nakedly revealing at times, but his public persona was cagey when the conversation turned personal.
What interests me about Gorey, first of all, is that he was a screaming aesthete but he was also interested in philosophy, not really in contemporary theory mind you, but in Asian philosophy. He’d read Chomsky and books on linguistics and even gave Lévi-Strauss a whirl, I think. In the Neumeyer letters he seems quite taken with Mircea Eliade, the legendary scholar of comparative religion. So he was no stranger to highbrow, brain-stretching philosophy and critical thought. Gorey is a labyrinthine figure; his mind is honeycombed with strange fascinations and obscure pockets of knowledge.
One of his signature phrases was “Oh, the … of it all,” which is an amazingly Derridean formulation. He leaves this lacuna in the middle of the sentence. It’s a very wry, poetical enactment of the inadequacy of language.
MG: So he’s italicizing that blank right where the most profound sentiment is supposed to sit?
MD: Exactly; he’s saying, in a sense, “What is the meaning of life? Oh, the [blank] of it all.” He’s attempting to step outside of language by letting this elegant silence stand in for what Derrida would call the “transcendental signified.” He’s saying, “I’m going to perform this epistemic trick because we know that language is inadequate to expressing the meaning of it all, so I’m going to use this blank, this emptiness, as a signifier of the ineffable — that profundity that cannot be ‘effed,’ so to speak.” In the Neumeyer correspondence, Gorey says, self-mockingly, “Here is the Ted Gorey Great Theory of Art,” which is that a work of art is always about what it doesn’t appear to be about and the best thing about it is what it isn’t about, which isn’t at all a restatement of the Hemingway adage that a work of literature draws its power, as an iceberg does, from the fact that ninety percent of what it’s saying is submerged, so you have this skeletalized prose that, through its elisions, gestures off-stage to this dark matter, this mass of what isn’t said explicitly in the narrative, but is nonetheless indicated in the narrative such that the reader is able to sense it on the level of his readerly unconscious. Gorey isn’t saying that. He’s saying something entirely different. A vulgarization of it may be that it’s the Freudian difference between the manifest and the latent content of a narrative; between text and subtext. But it’s much more mysterious than even that, perhaps much more Taoist than that. I can’t even pretend to fully grasp it; I’d have to delve much deeper into all of Gorey’s influences. The problem is that the guy was a brain-wrenching polymath. And there’s every bit of evidence to suggest that he was one of those vanishingly rare creatures, the true hyperlexic, someone who begins reading at a very, very early age — probably around age three, all the evidence suggests. And I don’t mean Pat the Bunny. 
By somewhere around six, he claimed, he was reading Henry James. He had certainly read Mary Shelley’s Frankenstein and Bram Stoker’s Dracula by that age; that’s adequately evidenced. He does concede, in a number of interviews, “I can’t imagine I understood James at that age,” but he did toil through some of his books, and later went on to read James’s entire corpus and emerge an inveterate James-loather. He had an absolutely unalloyed detestation for James but, for whatever incomprehensible reason, felt the need to read him and, on occasion, even reread him!
MG: What was it about James that he didn’t like?
MD: He felt James was an over-explainer. And, as I said, Gorey is sort of Derridean in the sense that he’s profoundly convinced, in his bones, of the inadequacy of language, and even of art, even as he is convinced of their ability to gesture toward their own inadequacy in a way that communicates beyond the signifier and the signified, into a sort of cosmos of dark matter where a meaning exists that is beyond meaning. Don’t ask me to explain, since I haven’t fully theorized this yet!
MG: As someone with a clear mastery of language, whose profession it is to examine and parse it, do you share Gorey’s bred-in-bone doubts about language? In an interview for Accelerator, there’s a moment where you recoil from the idea that “there isn’t a word for it,” maintaining that if there isn’t a word, one should coin one.
MD: Well, I’ve always been infuriated by people who say “I just don’t have words for it,” because they’re stammeringly telegraphing their own inadequacy, and I find that especially maddening in writers — this faltering hesitance which some may charitably frame as an insistence on linguistic exactitude, a tireless quest for le mot juste, and others may read as shorthand for Authenticity; I just see it as an ungenerosity, a miserly unwillingness to share one’s innermost thoughts or feelings. I mean, for fuck’s sake, as the English say, just spit it out, man. If you can’t find the right word for it, make one up. That’s why God gave us the neologism.
That, by the way, is one of the many great things about the American language, as Mencken called it: the felicity of the thing — the fact that it’s so conducive to the appropriation of le mot juste from other tongues. It’s the original mash-up, a great sort of engine of appropriation and recombination.
That said, I was being a little flip in that interview. The ineffable is well and truly real; the recovering Derridean in me readily acknowledges that language has its bounds. But at the same time poetic language dangles the hope that this self-referential system can stand on its own head. That’s why surrealism has been so important to me. I believe that the surrealist metaphor, the Lewis Carrollian portmanteau, the conflation of two unlike things — most famously, Lautréamont’s chance meeting of an umbrella and a sewing machine on a dissecting table — opens the door to a linguistic gambit, a writerly strategy in which we are able to use language to semaphore beyond its boundaries. The signpost points past the false promise of the transcendental signified. We know that language is a closed-loop system that ultimately refers only to itself. Every undergraduate knows that any word you look up unpacks itself in terms of other words, which you can look up, which in turn unpack themselves in terms of other words. And so on, in an infinite regress. But we can also imagine rising above that hedge maze of the mind in a way that really does “hack” the neon-green code of the matrix, allowing us to glimpse a reality outside of that, a world beyond language.
MG: Have you had one of those experiences where you’ve been able to “rise above the hedge maze,” as you say? Is that where your fascination with language stems from?
MD: For me — meaning: me as a symbol-juggling human, a signifying monkey rather than an intellectual — there’s a primacy to language, because the writerly act and the speech act are epistemic and ontological keystones that for me go back to my very first memory. My first memory is of being three or four, or however old I was when I started speaking, sitting alone in an apartment room and saying aloud, “I am not me,” and realizing that the word machine in me, the machinic assemblage (as Deleuze would call it) that was producing speech, was something apart from my consciousness of myself, and, curiouser and curiouser, that even the self is a linguistic construct, a narrative that cognition tells itself, a reassuring fiction about Who You Are. Deleuze says, in so many words, “Who is the I that says ‘I’?” which is just… a wave of gooseflesh washes over me every time I think of that quote! So there I was, a toddler alone in a room, asking myself, “Who is the I that says I?” It was a moment of profound ontological rupture, because I realized, in some inchoate way, that language is a virus, as Burroughs claims — a parasitic xenomorph dwelling within — and that there is an inaccessible part of the self that is not the linguistic self. It was so jarringly bizarre, almost like an out-of-body experience, so dislocating that I felt, for a moment, that I couldn’t re-inhabit my body. I started thinking about the fact that when I looked in the mirror, the face that looked back at me wasn’t me, just a mask made of meat with two black-hole eyes, infinitely dense, endlessly collapsing; that there was a me that was me that was simply operating this meat puppet. That weirded me out so profoundly that I couldn’t think about any of those things for years.
So something that really, really fascinates me, something I never stop thinking about and return to time and again, is this idea of what cognition would look like outside of language. Thought only becomes visible when it’s cloaked in language; when it uncloaks, it etherealizes into something phantasmic. What shape does prelinguistic thought assume? If we traveled back to the arboreal, pre-human apes of the African savanna and drilled a borehole into their internal monologues, I believe they would be unimaginably alien to us. What does the stream of consciousness of a creature that doesn’t have language look like?