A Critique of David Foster Wallace: Part One: OBLIVION

CLICK THE IMAGE ABOVE TO READ MY NOVEL WATCH OUT: THE FINAL VERSION!

A review of Oblivion (David Foster Wallace) by Joseph Suglia

When I was in graduate school, I was (mis)taught Literature by a man who had no ear for poetic language and who had absolutely no interest in eloquence.  I learned that he held an undergraduate degree in Physics and wondered, as he chattered on loudly and incessantly, why this strange man chose to study and teach Literature, a subject that obviously did not appeal to him very much.  I think the same thing of David Foster Wallace, a writer who probably would have been happier as a mathematician (Mathematics is a subject that Wallace studied at Amherst College).

A collection of fictions published in 2004, Oblivion reads very much as if a mathematician were trying his hand at literature after having surfeited himself with Thomas Pynchon and John Barth—not the best models to imitate or simulate, if you ask me.

The first fiction, “Mr. Squishy,” is by far the strongest.  A consulting firm evaluates the responses of a focus group to a Ho Ho-esque chocolate confection.  Wallace comes up with some delightful phraseologies: The product is a “domed cylinder of flourless maltitol-flavored sponge cake covered entirely in 2.4mm of a high-lecithin chocolate frosting,” the center of which is “packed with what amounted to a sucrotic whipped lard” [6].  The external frosting’s “exposure to the air caused it to assume traditional icing’s hard-yet-deliquescent marzipan character” [Ibid.].  Written in a bureaucratized, mechanical language–this language, after all, is the dehumanized, anti-poetic language of corporate marketing firms, the object of Wallace’s satire–the text is a comparatively happy marriage of content and form.

Wallace gets himself into difficulty when he uses this same bureaucratic language in the next fiction, “The Soul Is Not a Smithy,” which concerns a homicidal substitute teacher.  I could see how a sterile, impersonal narrative could, by way of counterpoint, humanize the teacher, but the writing just left me cold.  The title of the fiction simply negates Stephen Dedalus’s declaration in A Portrait of the Artist as a Young Man: “I go to encounter for the millionth time the reality of experience and to forge in the smithy of my soul the uncreated conscience of my race.”

Wallace never composed a sentence as beautiful as Joyce’s.  Indeed, Wallace never composed a beautiful sentence.

“Philosophy and the Mirror of Nature” simply duplicates the title (!) of Richard Rorty’s misguided polemic against representationalism (the misconceived idea that language is capable of mirroring the essence of things).  It concerns a son who accompanies his mother to a cosmetic-surgery procedure.  The son, who is also the narrator, says: “[A]nyone observing the reality of life together since the second procedure would agree the reality is the other way around…” [183].  The narrator might or might not be one of the deluded representationalists against whom Rorty polemicized.  For Rorty, “the reality of life” is not something that we are capable of talking about with any degree of insight.  Unfortunately, this is the only point in the text at which the philosophical problem of representation arises.

The eponymous fiction “Oblivion” and the self-reflexive “The Suffering Channel” (which concerns a man whose excreta are considered works of art) are inelegantly and ineloquently written.

After laboring through such verbal dross, I can only conclude that David Foster Wallace was afraid of being read and thus attempted to bore his readers to a teary death.  His noli me legere also applies to himself.  It is impossible to escape the impression that he was afraid of reading and revising any of the festering sentences that he churned out.  Inasmuch as he likely never read his own sentences, he likely never knew how awkward they sounded.  Infinite Jest was written hastily and unreflectively, without serious editing or revision, it appears.  It is merely because of the boggling bigness of Infinite Jest that the book has surfaced in the consciousness of mainstream America at all (hipsterism is a vicissitude of mainstream America).  We, the Americanized, are fascinated by bigness.  To quote Erich Fromm: “The world is one great object for our appetite, a big apple, a big bottle, a big breast; we are the sucklers…”

Speech is irreversible; writing is reversible.  If you accept this premise of my argument (and any intelligent person would), must it not be said that responsible writers ought ALWAYS to recite and revise their own sentences?  And does it EVER seem that Wallace did so?

The prose of Oblivion is blearily, drearily, eye-wateringly tedious.  The hipsters will, of course, claim in advance that the grueling, hellish tedium of Wallace’s prose was carefully choreographed, that every infelicity was intentional, and thus obviate any possible criticism of their deity, a deity who, like all deities, has grown more powerful in death.  That is, after all, precisely what they say of the Three Jonathans, the sacred triptych of hipsterdom: Foer, Franzen, and Lethem, the most lethal of them all.

One thing that even the hipsters cannot contest: David Foster Wallace did not write fictionally for his own pleasure.  Unlike Kafka, he certainly did not write books that he ever wanted to read.

A valediction: The early death of David Foster Wallace is terrible and should be mourned.  He was a coruscatingly intelligent man.  My intention here is not to defame the dead.  Since I am a literary critic, I must recommend that the reader spend time with better books and leave his writings alone.  As I suggested above, he probably didn’t want his prose to be read, anyway.

Joseph Suglia

CLICK THE IMAGE BELOW TO READ MY MASTERPIECE TABLE 41:

Three Aperçus: On DEADPOOL (2016), David Foster Wallace, and Beauty


by Joseph Suglia

Deadpool (2016) is capitalism with a smirking face.

David Foster Wallace was not even a bad writer.

Beauty is the one sin that the Average American of today cannot forgive.

*


Two Aperçus: THE NEON DEMON (2016)


The Neon Demon (2016) is a snuff film in which art is murdered.

Descent (2007) is superior to The Neon Demon because the former has an Aristotelian structure–which works.

Joseph Suglia


HOUSE OF LEAVES by Mark Z. Danielewski / WHEN DID WRITING STOP HAVING TO DO WITH WRITING? – by Dr. Joseph Suglia


WHEN DID WRITING STOP HAVING TO DO WITH WRITING?
by Dr. Joseph Suglia

When did writing stop having to do with writing?  Of the many attempts to communalize literature, none is more dangerous than the sway of the current ideology: the consensus, and consciousness, that writing has nothing to do with writing.  You will hear readers talk about “plot” (in other words, life).  You will hear them talk about the “author.”  But writing?  Writing has nothing to do with writing.  No one cares whether a book is well-written anymore.

* * * * *

Mark Z. Danielewski is not very much interested in language.  He cares more about graphics than he does about glyphs.  No words live in his House of Leaves.  It is a house of pictures, not of words.  It is a house in which words only exist as blocks of physical imagery.

Allow me to cite a few not unrepresentative sentences/fragments from House of Leaves:

1.) “A hooker in silver slippers quickened by me” [296].  Danielewski, scholar, thinks that “to quicken” means “to move quickly.”

2.) “Regrettably, Tom fails to stop at a sip” [320].  I convulse in agony as I read this sentence.

3.) “Pretentious,” too often, is American for “intelligent.”  It is a word that is often misapplied.  However, in the case of House of Leaves, it must be said that Danielewski uses German pretentiously.  In a book that is littered with scraps of the German language, shouldn’t that language be used properly?  “der absoluten Zerissenheit” [sic; 404 and elsewhere — a Heideggerian citation] should read “die absolute Zerrissenheit”–the genitive is never earned.  “unheimliche vorklaenger” [sic; 387] should read “unheimliche Vorklänge” and does not mean “ghostly anticipation.”  Whenever Danielewski quotes the German, he is being pretentious–that is, he is pretending to know things of which he knows nothing.

It is impossible to escape the impression that Mark Z. Danielewski does not want to be read.  Noli me legere = “Do not read me.”  House of Leaves is a book to be looked at, not one to be read.  Its sprawling typographies and fonts distract the reader from the impoverished prose.

Words are reduced to images, to pictures.

* * * * *

When did writing stop having to do with writing?  When novels became precursors to screenplays.  The terminus a quo is 1963, with the publication of Charles Webb’s The Graduate.  The novel is a proto-screenplay, as was Ira Levin’s Rosemary’s Baby, published in 1967.  The film studio (William Castle Enterprises) optioned the novel even before Levin finished writing it!  Astoundingly, Rosemary’s Baby, according to my interpretation, is a novel about the diabolical essence of the Hollywood entertainment industry!

With the rise of mainstream cinema came the denigration of literature.  The visual overthrew the verbal.  Around the same time, imaginative prose began to be dumbed well down.  There are two infantile reductions at work, both of which are visible in House of Leaves: a dumbing-down of language and an accent on the optical (as opposed to the verbal).

Such infantile reductions are everywhere in evidence whenever one picks up a contemporary American novel.  We can thank America for the coronation of the idiot and for an all-embracing literary conformism.  Even stronger writers, these days, morosely submit to the prevailing consolidation of a single “literary style.”  A style that, of course, is no style at all.  And these same writers, listlessly and lifelessly, affirm in reciprocal agreement that the construction of a well-wrought sentence isn’t something worth spending time on.  Or blood.

How self-complacent American writers have become!  The same country that produced Herman Melville, William Faulkner, and Saul Bellow has given birth to Mark Z. Danielewski.  Nothing is more hostile to art than a culture of complacency.

There was, I’m sure, something very refreshing about Charles Bukowski in the 1970s, when the vestiges of a literary academism still existed.  Mr. Bukowski, I am assuming, would be dismayed to uncover the kindergarten of illiterate “literati” to which he has illegitimately given birth.  His dauphin, Mark Z. Danielewski.

Weaker students of literature might feel invigorated by the Church of Literary Infantilism, yet even they know that the clergy engenders nothing sacred or profane.  This explains their virulent defensiveness when anyone, such as myself, dares to write well or explore another writer’s engagement with language.  “Writing doesn’t matter,” you see.  They have never luxuriated in the waters of language; they have never inhabited a world of words.  Words don’t interest them; people do.  And literary discussions have degenerated to the level of a bluestockinged Tupperware party.  If you like the main character, the book is “good.”  If a book is warm and friendly, that book is “good.”  If a book reassures you that you are not a slavering imbecile–that is to say, if you can write better than the book’s “author”–that book is “good.”  If a book disquiets you or provokes any kind of thought whatsoever, that book is “bad.”  If a book has an unsympathetic main character, that book is “bad.”  If a book is difficult to understand, that book is “bad,” and so forth and so on.  Whatever exceeds the low, low, low standards of the average readership, in a word, is blithely dismissed as “bad.”

Things grow even more frightening when we consider the following: These unlettered readers are quickly transforming into writers.  That would be fine if they knew how to write.  And if the movements of language were valued, culturally and humanly, their noxious spewings would find no foothold.  The literature of challenge has been supplanted by the litter of the mob, with all of its mumbling solecisms and false enchantments.  The problem with mobs, let us remind ourselves, is that they efface distinctions.  They do everything in their power to make the distinguished undistinguished.  And so instead of James Joyce, we have bar-brawling beefheads (e.g. Chuck Palahniuk), simian troglodytes (e.g. Henry Rollins), and graphic designers / typographers (e.g. Mark Z. Danielewski).

Instead of poeticisms, we have grunts.  We have pictures.  We have graphic design and cinema.

* * * * *

Someone said to me: “I am a good writer, but I don’t know how to spell.”

Someone said to me: “No writer is better than any other.”

* * * * *

America is responsible for the production of more linguistic pig-shit than any other country in the world.  There is absolutely nothing surprising about this statement.  After all, America is the only country that celebrates stupidity as a virtue.  How could things be otherwise?

At the poisonous end of the democratization process, which is indistinguishable from the process of vulgarization, every jackass on the street sees himself as an “author.”  His brother, his grandmother, and his step-uncle: they, too, regard themselves as “authors.”  After all, they think–inasmuch as they are capable of thinking–“Writing has nothing to do with writing.  If Mark Z. Danielewski can be published, so can I!”  (Yes, their desire is “to be published,” as if their lives would be inscribed on the page, disseminated, filmed, and thus rendered meaningful.)  We live in an age of all-englobing and infinitely multiplying cyber-technologies, where stammering imbeciles mass-replicate their infantile scribbles, but let us not deceive ourselves: If a “writer” is simply one who writes, then they are writers; however, one should reserve the word “author” only for those who are profoundly committed to the craft of verbal composition.

* * * * *

Judging from a purely technical point of view, House of Leaves is consistently faulty, fraught with excruciating Hallmark banalities and galling linguistic errors.  Hipster Mark Z. Danielewski is seemingly incapable of composing a single striking or insightful sentence.  It astonishes me that anyone ever considered his tinker-toy bromides to be publishable.  The House of Leaves is a house that is neither well-appointed nor ill-appointed.  It is simply not appointed at all.

* * * * *

Who cares about language anymore?  No one in America even questions the assumption that good writing does not matter.  And this assumption is no longer limited to America–a horrific logophobia is spreading throughout the globe.  The impetuses that motivate this tsunami of “literary” vomit are the following ideological assumptions: 1.) the fallacy that everyone is entitled to be an author (a particularly nasty perversion of the democratic principle) and 2.) the fallacy that the visible improves on the verbal.  American letters have been reduced to the gibbering and jabbering of semiliterate simpletons, driveling half-wits, and slack-jawed middlebrows.  It’s only a matter of time before the English stop caring about language, as well.

When you live in a culture of complacency, a culture of appeasement, a hypocritical culture that assures you that you write well even if you don’t, there is only one way out.  There is nothing for the strong and serious student of literature to do but to write for himself, to write for herself, for his or her own sake.

Joseph Suglia


Analogy Blindness: I invented a linguistic term. Dr. Joseph Suglia


ANALOGY BLINDNESS by Joseph Suglia

Over the years, I have invented a number of words and phrases.  Genocide pornography is one that I am especially proud of (cf. my essays on Quentin Tarantino); anthropophagophobia is another word that I coined, which means “the fear of cannibalism” (cf. my interpretation of Shakespeare’s As You Like It).  I would like to introduce to the world (also known as Google) a new linguistic term:

analogy blindness (noun phrase): the inability to perceive what an analogy represents.  To be lost in the figure of an analogy itself, while losing sight of the concept that the analogy describes.

EXAMPLE A

The Analogist: Polygamy is like going to a buffet instead of a single-serve restaurant.  Both are inadvisable.

The Person Who Is Blind to the Analogy: People love buffets!

EXAMPLE B

The Analogist: Being taught how to write by Chuck Palahniuk is like being taught how to play football by a one-legged man.

The Person Who Is Blind to the Analogy: A one-legged man who knows how to coach football?  That’s great!

EXAMPLE C

The Analogist: You should not have reprimanded her in such a rude manner for taking time off from work.  You treated her as if she were guilty of some terrible offense, such as plagiarism.

The Person Who Is Blind to the Analogy: But plagiarism is bad!

EXAMPLE D

Derived from Hui-neng: When the wise person points at the Moon, the imbecile sees the finger.

Joseph Suglia


David Foster Wallace and Macaulay Culkin: Two aperçus


David Foster Wallace was a sudorific pseudo-author.

Macaulay Culkin holds only one thing in common with the young Lou Reed: a heroin addiction.

Joseph Suglia


A commentary on HUMAN, ALL-TOO-HUMAN / MENSCHLICHES, ALLZUMENSCHLICHES by Friedrich Nietzsche


HUMAN, ALL-TOO-HUMAN / MENSCHLICHES, ALLZUMENSCHLICHES (Friedrich Nietzsche)

A commentary by Joseph Suglia

MAM = Menschliches, Allzumenschliches. Ein Buch für freie Geister (1878); second edition: 1886

VMS = Vermischte Meinungen und Sprüche (1879)

WS = Der Wanderer und sein Schatten (1880)

The following will not have been an interpretation of Nietzsche’s Human, All-Too-Human.  It will have been a commentary: Comment taire?, as the French pun on commentaire has it.  “How to silence?”  In other words: How should the commentator silence his or her own voice and invisibilize his or her own presence in order to amplify the sound of the text and magnify the text’s image?

An interpretation replaces one meaning with another, or, as Heidegger would say, regards one thing as another.  A commentary adds almost nothing to the text under consideration.

Nietzsche’s Psychological Reductionism and Perspectivalism

Human, All-Too-Human is almost unremittingly destructive.  For the most part, it only has a negative purpose: to demolish structures and systems of thought.  However, there is also a positive doctrine within these pages, and that is the doctrine of total irresponsibility and necessity (to which I will return below) and the promise of a future humanity that will be unencumbered by religion, morality, and metaphysics.

In the preface to the second edition (1886), Nietzsche makes the thrust and tenor of his book clear with the following words: The purpose of the book is “the inversion of customary valuations and valued customs” (die Umkehrung gewohnter Wertschätzungen und geschätzter Gewohnheiten).  The highest ideals are reduced to the basest human-all-too-humanness of human beings.  This is a form of psychological reductionism: Once-good values (love, fidelity, patriotism, motherliness) are deposed.  The man who mourns his dead child is an actor on an imaginary stage who performs the act of mourning in order to stir up the emotions of his spectators—he is vain, not selflessly moral.  The faithful girl wants to be cheated upon in order to prove her fidelity—she is egoistic, not selflessly moral.  The soldier wants to die on the battlefield in order to prove his patriotism—he is egoistic, not selflessly moral.  The mother gives up sleep to prove her virtuous motherliness—she is egoistic, not selflessly moral [MAM: 57].

The inversion of valuations leads to an advocacy of the worst values: vanity and egoism (but never the vaingloriousness of arrogance, against which Nietzsche warns us for purely tactical reasons).  As well as lying.  Nietzsche praises lying at the expense of the truth to the point at which lying becomes the truth, and the truth becomes a lie that pretends that it is true.  This, of course, is a paradox, for anyone who says, “There is no truth, only interpretations of truth” is assuming that one’s own statement is true.

Again and again, Nietzsche phenomenalizes the world.  Appearance (Schein) becomes being (Sein): The hypocrite is seduced by his own voice into believing the things that he says.  The priest who begins his priesthood as a hypocrite, more or less, will eventually turn into a pious man, without any affectation [MAM: 52].  The thing in itself is a phenomenon.  Everything is appearance.  There is no beyond-the-world; there is nothing outside of the world, no beyond on the other side of the world, no επέκεινα.

As far as egoism is concerned, Nietzsche tells us again and again: All human beings are self-directed.  I could just as easily have written, All human beings are selfish, but one must be careful.  Nietzsche does not believe in a hypostatized self.  Every individual, Nietzsche instructs us, is a dividual (divided against himself or herself), and the Nietzsche of Also Sprach Zarathustra (1883-1885) utterly repudiates the idea of a substantialized self.  To put it another way: No one acts purely for the benefit of another human being, for how could the first human being do anything without reference to himself or herself?  Nie hat ein Mensch Etwas gethan, das allein für Andere und ohne jeden persönlichen Beweggrund gethan wäre; ja wie sollte er Etwas thun können, das ohne Bezug zu ihm wäre? [MAM: 133].  Only a god would be purely other-directed.  Lichtenberg and La Rochefoucauld are Nietzsche’s constant points of reference in this regard.  Nietzsche never quotes this Rochefoucauldian apothegm, but he might as well have:

“True love is like a ghost which many have talked about, but few have seen.”

Or:

“Jealousy contains much more self-love than love.”

Whatever is considered “good” is relativized.  We are taught that the Good is continuous with the Evil, that both Good and Evil belong to the same continuum.  Indeed, there are no opposites, only degrees, gradations, shades, differentiations.  Opposites exist only in metaphysics, not in life, which means that every opposition is a false opposition.  When the free spirit recognizes the artificiality of all oppositions, s/he undergoes the “great liberation” (grosse Loslösung)—a tearing-away from all that is traditionally revered—and “perhaps turns [his or her] favor toward what previously had a bad reputation” (vielleicht nun seine Gunst dem zugewendet, was bisher in schlechtem Rufe stand) [Preface to the second edition].  The awareness that life cannot be divided into oppositions leads to an unhappy aloneness and a lone unhappiness, which can only be alleviated by the invention of other free spirits.

What is a “free spirit”?  A free spirit is someone who does not think in the categories of Either/Or, someone who does not think in the categories of Pro and Contra, but sees more than one side to every argument.  A free spirit does not merely see two sides to an argument, but rather as many sides as possible, an ever-multiplying multiplicity of sides.  As a result, free spirits no longer languish in the manacles of love and hatred; they live without Yes, without No.  They no longer trouble themselves over things that have nothing to do with them; they have to do with things that no longer trouble them.  They are mistresses and masters of every Pro and every Contra, every For and every Against.

All over the internet, you will find opposing camps: feminists and anti-feminists, those who defend religious faith and those who revile religious faith, liberals and conservatives.  Nietzsche would claim that each one of these camps is founded upon the presupposition of an error.  And here Nietzsche is unexpectedly close to Hegel: I am thinking of Nietzsche’s perspectivalism, which is, surprisingly, closer to the Hegelian dialectic than most Nietzscheans and Hegelians would admit, since they themselves tend to be one-sided.  In all disputes, the free spirit sees each perspective as unjust because one-sided.  Instead of choosing a single hand, the free spirit considers both what is on the one hand and what is on the other (einerseits—andererseits) [MAM: 292].  The free spirit hovers over all perspectives, valuations, evaluations, morals, customs, and laws: ihm muss als der wünschenswertheste Zustand jenes freie, furchtlose Schweben über Menschen, Sitten, Gesetzen und den herkömmlichen Schätzungen der Dinge genügen [MAM: 34].  It is invidiously simplistic and simplistically invidious to freeze any particular perspective.  Worse, it is anti-life, for life is conditioned by perspective and its injustices: das Leben selbst [ist] bedingt durch das Perspektivische und seine Ungerechtigkeit [Preface to the second edition].  A free spirit never takes one side or another, for that would reduce the problem in question to the simplicity of a fixed opposition, but instead does justice to the many-sidedness of every problem and thus does honor to the multifariousness of life.

There Is No Free Will.  Sam Harris’s Unspoken Indebtedness to Nietzsche.

Let me pause over three revolutions in the history of Western thought.

The cosmological revolution known as the “Copernican Revolution” marked a shift from the conception of a cosmos in which the Earth is the center to the conception of a system in which the Sun is the center.  A movement from geocentrism (and anthropocentrism) to heliocentrism.

The biological revolution took the shape of the theory of evolution (“It’s only a theory!” exclaim the unintelligent designers), which describes the adaptation of organisms to their environments through the process of non-random natural selection.

There is a third revolution, and it occurred in psychology.  I am not alluding to psychoanalysis, but rather to the revolution that predated psychoanalysis and made it possible (Freud was an admirer of Nietzsche).  Without the Nietzschean revolution, psychoanalysis would be unthinkable, and Twitter philosopher Sam Harris’s Free Will (2012) would never have existed.

I am alluding to the revolution that Nietzsche effected in 1878.  It was a silent revolution.  Almost no one seems aware that this revolution ever took place.

It is the revolution that marks the turning away from voluntarism (the theory of free will) and the turning toward determinism, and Nietzsche’s determinism will condition his critique of morality.  Nietzschean determinism is the doctrine of total irresponsibility and necessity.

[Let it be clear that I know that Spinoza, Hume, Hobbes, Schopenhauer, et al., wrote against the concept of the free will before Nietzsche.]

The free will is the idea that we have control over our own thoughts, moods, feelings, and actions.  It conceives of the mind as transparent to itself: We are aware in advance of why we do-say-write-think the things that we do-say-write-think.  This idea is false: You no more know what your next thought will be than you know what the next sentence of this commentary will be (if this is your first time reading this text).  It is only after the fact that we assign free will to the sources of actions, words, and thoughts.  Our thoughts, moods, and feelings—e.g. anger, desire, affection, envy—appear to us as isolated mental states, without reference to previous or subsequent thoughts, moods, and feelings: This is the origin of the misinterpretation of the human mind known as “the free will” (the definite article the even suggests that there is only one).  The free will is an illusion of which we would do well to disabuse ourselves.

We do not think our thoughts.  Our thoughts appear to us.  They come to the surfaces of our consciousness from the abysms of the unconscious mind.  Close your eyes, and focus on the surfacings and submersions of your own thoughts, and you will see what I mean.

This simple exercise of self-observation suffices to disprove the illusion of voluntarism.  If your mind is babbling, this very fact of consciousness refutes the idea of free will.  Mental babble invalidates the voluntarist hypothesis.  Does anyone truly believe that s/he wills babble into existence?  Does anyone deliberately choose the wrong word to say or the wrong action to perform?  If free will existed, infelicity would not exist at all or would exist less.  After all, what would free will be if not the thinking that maps out what one will have thought-done-said-written—before actually having thought one’s thought / done one’s deed / said one’s words / written one’s words?

Belief in free will provokes hatred, malice, guilt, regret, and the desire for vengeance.  After all, if someone chooses to behave in a hateful way, that person deserves to be hated.  Anyone who dispenses with the theory of the free will hates less and loves less.  No more desire for revenge, no more enmity.  No more guilt, no more regret.  No more rewards for impressive people who perform impressive acts, for rewarding implies that the rewarded could have acted differently than s/he did.  In a culture that accepted the doctrine of total irresponsibility, there would be neither heroes nor villains.  There would be no reason to heroize taxi drivers who return forgotten wallets and purses to their clients, nor would there be any reason to heroize oneself, since what a person does is not his choice / is not her choice.  No one would be praised, nor would anyone praise oneself.  No one would condemn others, nor would anyone condemn oneself.  Researchers would investigate the origins of human behavior, but would not punish, for the sources of all human thought and therefore the sources of all human behavior are beyond one’s conscious control / beyond the reach of consciousness.  It makes no sense to say / write that someone is “good” or “evil,” if goodness and evilness are not the products of a free will.  There is no absolute goodness or absolute evilness; nothing is good as such or evil as such.  There is neither voluntary goodness nor voluntary evilness.

If there is no free will, there is no human responsibility, either.  The second presupposes the first.  Do you call a monster “evil”?  A monster cannot be evil if it is not responsible for what it does.  Do we call earthquakes “evil”?  Do we call global warming “evil”?  Natural phenomena are exempt from morality, as are non-human animals.  We do not call natural phenomena “immoral”; we consider human beings “immoral” because we falsely assume the existence of a free will.  We feel guilt / regret for our “immoral” actions / thoughts, not because we are free, but because we falsely believe ourselves to be free: [W]eil sich der Mensch für frei hält, nicht aber weil er frei ist, empfindet er Reue und Gewissensbisse [MAM 39].  No one chooses to have Asperger syndrome or Borderline Personality Disorder.  Why, then, should someone who is afflicted with Asperger syndrome or Borderline Personality Disorder be termed “evil”?  No one chooses one’s genetic constitution.  You are no more responsible for the emergence of your thoughts and your actions than you are responsible for your circulatory system or for the sensation of hunger.

Those who would like to adumbrate Nietzsche’s “mature” thought should begin with Human, All-Too-Human (1878), not with Daybreak (1881).  Nietzsche’s critique of morality makes no sense whatsoever without an understanding of his deeper critique of voluntarism (the doctrine of free will): Again, the ideas of Good and Evil only make sense on the assumption of the existence of free will.

Anyone who dispenses with the idea of free will endorses a shift from a system of punishment to a system of deterrence (Abschreckung).  A system of deterrence would restrain and contain criminals so that someone would not behave badly, not because someone has behaved badly.  As Nietzsche reminds us, every human act is a concrescence of forces from the past: one’s parents, one’s teachers, one’s environment, one’s genetic constitution.  It makes no sense, then, to believe that any individual is responsible for what he or she does.  All human activity is motivated by physiology and the unconscious mind, not by Good or Evil.  Everything is necessary, and it might even be possible to precalculate all human activity, through the mechanics of artificial intelligence, to steal a march on every advance: Alles ist notwendig, jede Bewegung mathematisch auszurechnen… Die Täuschung des Handelnden über sich, die Annahme des freien Willens, gehört mit hinein in diesen auszurechnenden Mechanismus [MAM: 106].

If you accept the cruelty of necessity (and is life not cruel, if we have no say in what we think and what we do?), the nobility of humanity falls away (the letter of nobility, the Adelsbrief) [MAM: 107].  All human distinction is devalued, since it is predetermined—since it is necessary.  Human beings would finally recognize themselves within nature, not outside of nature, as animals among other animals.  I must cite this passage in English translation, one which is not irrelevant to this context and one which belongs to the most powerful writing I have ever read, alongside Macbeth’s soliloquy upon learning of his wife’s death: “The ant in the forest perhaps imagines just as strongly that it is the goal and purpose for the existence of the forest as we do, when we in our imagination tie the downfall of humanity almost involuntarily to the downfall of the Earth: Indeed, we are still modest if we stop there and do not arrange a general twilight of the world and of the gods (eine allgemeine Welt- und Götterdämmerung) for the funeral rites of the final human (zur Leichenfeier des letzten Menschen).  The most dispassionate astronomer can himself scarcely feel the lifeless Earth in any other way than as the gleaming and floating gravesite of humanity” [WS: 14].

The demystification of the theory of free will has been re-presented by Sam Harris, who might seem like the Prophet of the Doctrine of Necessity.  Those who have never read Nietzsche might believe that Dr. Harris is the first person to say these things, since Dr. Harris never credits Nietzsche’s theory of total human irresponsibility.  If you visit Dr. Harris’s Web site, you will discover a few English translations of Nietzsche on his Recommended Reading List.  We know that Dr. Harris’s first book (unpublished) was a novel in which Nietzsche is a character.  We also know that Dr. Harris was a student of Philosophy at Stanford University.  He would therefore not have been unaware of the Nietzschean resonances in his own text Free Will.  Why, then, has Dr. Harris never publicly acknowledged his indebtedness to Nietzschean determinism?

Nietzsche Is / Is Not (Always) a Misogynist.

In 1882, Nietzsche was sexually rejected by Lou Andreas-Salomé, a Russian intellectual, writer, and eventual psychoanalyst who spellbound seemingly every cerebral man she met, including Rilke and Paul Rée.  Since the first edition of Human, All-Too-Human was published four years earlier, Salomé’s rejection of Nietzsche cannot be said to have had an impact on his reflections on women at that stage in the evolution of his thinking.

Nietzsche is sometimes a misogynist.  But I must emphasize: He is not always a misogynist.

At times, Nietzsche praises women / is a philogynist.  To give evidence of Nietzsche’s philogyny, all one needs to do is cite Paragraph 377 of the first volume: “The perfect woman is a higher type of human being than the perfect man” (Das vollkommene Weib ist ein höherer Typus des Menschen, als der vollkommene Mann).  Elsewhere, Nietzsche extols the intelligence of women: Women have the faculty of understanding (Verstand), he writes, whereas men have mind (Gemüth) and passion (Leidenschaft) [MAM: 411].  The loftier term Verstand points to the superiority of women over men.  Here, Nietzsche is far from misogynistic—indeed, he almost seems gynocratic.

Nor is Nietzsche a misogynist, despite appearances, in the following passage—one in which he claims that women tolerate thought-directions that are logically in contradiction with one another: Widersprüche in weiblichen Köpfen.—Weil die Weiber so viel mehr persönlich als sachlich sind, vertragen sich in ihrem Gedankenkreise Richtungen, die logisch mit einander in Widerspruch sind: sie pflegen sich eben für die Vertreter dieser Richtungen der Reihe nach zu begeistern und nehmen deren Systeme in Bausch und Bogen an; doch so, dass überall dort eine todte Stelle entsteht, wo eine neue Persönlichkeit später das Übergewicht bekommt [MAM: 419].

To paraphrase: Nietzsche is saying that the minds of women are fluxuous and not in any pejorative sense.  He means that multiple positions coexist simultaneously in the consciousnesses of women.  Personalities are formed and then evacuate themselves, leaving dead spots (todte Stellen), where new personalities are activated.  This does not mean that the minds of women contain “dead spots”—it means that they are able to form and reform new personalities, which is a strength, not a weakness.  And yet does he not say the same thing about his invisible friends, the free spirits?  Free spirits are also in a state of constant flux, and their fluxuousness, while necessarily unjust to their own opinions, allows them to move from opinion to opinion with alacrity and to hold in their heads multiple opinions at the same time.  Free spirits have opinions and arguments, but no convictions, for convictions are petrific.  Free spirits are guiltless betrayers of their own opinions [MAM: 637] and goalless wanderers from opinion to opinion [MAM: 638].

Why would the substitution-of-one-position-for-another, intellectual inconstancy, be considered something negative?  Is the ability to substitute a new position for an older one with alacrity not a trait of the free spirit?  And is the free spirit not Nietzsche’s ideal human being—at least before the overhuman takes the stage?  Such is my main argument: Free-spiritedness is womanliness, and free spirits are womanly, if we accept Nietzsche’s definitions of “free-spiritedness” and of “womanliness.”

This is not to deny the strain of misogyny that runs throughout Nietzsche’s collected writings.  Yes, Nietzsche does write unkind and unjustifiable things about women—some of his statements about women are downright horrible and indefensible.  My objective here is to highlight the polysemy and polyvocality of his writing, its ambiguity.  For a further discussion of Nietzsche’s ambiguous representations of the feminine, consult Derrida’s Spurs, wherein he analyzes the figure of the veil in Beyond Good and Evil.

To say or write that Nietzsche is always a misogynist would be to disambiguate his work—if by “Nietzsche” one is referring to the paper Nietzsche.  (For a series of accounts of Nietzsche as a human being, see Conversations with Nietzsche: A Life in the Words of His Contemporaries, published by Oxford University Press.)  Nonetheless, let us pause over the historical, living human being Friedrich Nietzsche, who was male, and his relation to one historical, living human being, who was female: Marie Baumgartner, the mother of one of Nietzsche’s students and his sometime French translator.  In the original manuscript of Mixed Opinions and Maxims, the first appendix to Human, All-Too-Human, Nietzsche wrote: “Whether we have a serpent’s tooth or not is something that we do not know until someone has put his heel upon us.  Our character is determined even more by the lack of certain experiences than by what we have experienced” [VMS: 36].  In a letter to Nietzsche dated 13 November 1878, Marie Baumgartner wrote: “I would gladly have added to your very striking maxim: ‘a woman or mother would say, until someone puts his heel upon her darling or her child.’  For a woman will not silently allow something to happen to them that in most cases she patiently accepts for herself.”  Nietzsche was so affected by Baumgartner’s rather delicately worded suggestion that he modulated the text to reflect her proposal.  If Nietzsche regarded women as inferior (and he never did), why would he take seriously something that a female reader wrote about his manuscript—so seriously that he modified his manuscript to incorporate her words?  The fact that Nietzsche reflected Marie Baumgartner’s suggestion in the revision of his manuscript is evidence enough that he respected the intelligence of this particular woman—the grain of his own writing confirms that he respected the intelligence of women in general and even considered women in general to be more intelligent than men in general.

Nietzsche Was Not an Atheist, if by “Atheist” One Means “Someone Who Does Not Believe in God.”

Nietzsche tells us, in Paragraph Nine of the first volume, “Even if a metaphysical world did exist, it would be nothing other than an otherness [Anderssein] that would be unavailable and incomprehensible to us; it would be a thing with [purely] negative characteristics.”

My question (which has been inspired by Nietzsche) is the following: Why do we even care about the beyond?  Should questions such as “Is there life after death?” not be greeted with apathy?  Why are we engaged with such questions to begin with?  Do not such questions merit indifference rather than seriousness?

Questions such as “Does God exist?” and “Is there life after death?” cannot be answered scientifically or logically.  We do not require their answers in order to live.  All of us live out our lives without knowing the answers to such questions.  Not merely that: It is entirely possible to live out our lives without ever ASKING or PURSUING such questions—and would we not be better off for not having done so?

Let me put it another way: Do the questions “Why does the world exist?” and “Why is there being rather than nothing?” not presuppose a reason for existing and a reason for being?  I am looking at you, Heidegger.

The Nietzsche of 1878 is not an atheist, if by “atheist” one means “someone who does not believe in God.”  Those who contest the existence of a deity or deities are practicing a form of skiamachy.  Nietzsche, on the other hand, is someone who considers questions about the existence of God, or of any extra-worldly transcendence, to be superfluous.  Otherworldliness is not something that can be discussed, since it is purely negative.

Moreover, the Nietzsche of Human, All-Too-Human is not merely not an atheist.  He is also not a philosopher, if by “philosopher,” we mean someone who speculates about imaginary worlds / is an imaginary world-builder.  Nietzsche will not become a philosopher, speculative or otherwise, until the very end of his period of lucidity, with the doctrines of the Eternal Recurrence of the Always-Same and the Will to Power.

Nietzsche Contradicts Himself.  Often.  But This Is Not a Flaw in His Thinking.

Nietzsche contradicts himself—often—but this is not a flaw in his thinking.  He tells us to stop using the word “optimism” [MAM: 28] and then uses the word himself, without any perceptible irony, in other sections of the book.  After scolding us for believing in heroes, he warmly sponsors the “refined heroism” (verfeinerten Heroismus) of the free spirit who works in a small office and passes quietly into and out of life [MAM: 291].  In Paragraph 148 of the first volume, Nietzsche claims that the poet alleviates (erleichtert) life—this seems to contradict his claim, five paragraphs later, that “art makes the heart of the thinker heavy” (Die Kunst macht dem Denker das Herz schwer), that listening to Beethoven’s Ninth Symphony infuses the listener with the heavy feeling of immortality, with religious and metaphysical conceptions.  If Nietzsche contradicts himself, and he does, this is because free-spiritedness is multitudinous, multi-perspectival, self-contradictory thinking.  Free-spiritedness is multi-spiritedness.

Aphorisms Inspired by Nietzsche

On Religion and Politics

What is religious is political, and what is political is religious.

On Morality

Morality depends on opportunity.

On Communication

A word means something different to you than it does to me, which means that communication is impossible: Nothing is communicable save the power to communicate the impossibility of communication.  (Nietzsche suggests that the worst alienation is when two people fail to understand each other’s irony.)  Consciousness of this fact would liberate us from the bitterness and intensity of every sensation.

On Interpretation

The mind is geared not toward what has been interpreted, but toward that which has not been interpreted and might not even be interpretable.  Nietzsche: “We take something that is unexplained and obscure to be more important than something that has been explained and made clear” [MAM: 532].

On the Voice

We often disagree with someone because of the sound of his or her voice.  We often agree with someone because of the sound of his or her voice.

On Salvation

In a 1966 interview with Der Spiegel, Heidegger claimed: “Only a god can save us.”  This statement must be revised: Not even a god could save us now.

On Censorial America

In contemporary America, you may be prosecuted and persecuted for what you think, insofar as what you think is available in language.

Joseph Suglia

A Critique of David Foster Wallace: Part Two: A Supposedly Fun Thing That I Will Never Do Again / “E Unibus Pluram: Television and U.S. Fiction” / “Getting Away from Already Being Pretty Much Away from It All” / “David Lynch Keeps His Head”

An Analysis of A SUPPOSEDLY FUN THING THAT I WILL NEVER DO AGAIN (David Foster Wallace) by Joseph Suglia

I have written it before, and I will write it again: Writing fictionally was not one of David Foster Wallace’s gifts.  His métier was, perhaps, mathematics.  It is possible that David Foster Wallace was a talented theorist of mathematics (I am unqualified to judge talent in the field of mathematics), but he was an absolutely dreadful writer of ponderous fictions (I am qualified to judge talent in the field of literature).

Wallace’s essay-aggregate A Supposedly Fun Thing that I Will Never Do Again (1997) is worth reading, if one is an undiscriminating reader, but it also contains a number of vexing difficulties that should be addressed.  I will focus here upon the two essays to which I was most attracted: “E Unibus Pluram: Television and U.S. Fiction” and “David Lynch Keeps His Head,” a conspectus on the director’s cinema from Eraserhead (1977) through Lost Highway (1997).  Wallace seems unaware of Lynch’s work before 1977.

In “E Unibus Pluram,” Wallace warmly defends the Glass Teat in the way that only an American can.  He sees very little wrong with television, other than the fact that it can become, in his words, a “malignant addiction,” which does not imply, as Wallace takes pains to remind us, that it is “evil” or “hypnotizing” (38).  Perish the thought!

Wallace exhorts American writers to watch television.  Not merely should those who write WATCH television, Wallace contends; they should ABSORB television.  Here is Wallace’s inaugural argument (I will attempt to imitate his prose):

1.) Writers of fiction are creepy oglers.

2.) Television allows creepy, ogling fiction-writers to spy on Americans and draw material from what they see.

3.) Americans who appear on television know that they are being seen, so this is scopophilia, but not voyeurism in the classical sense. [Apparently, one is spying on average Americans when one watches actors and actresses on American television.]

4.) For this reason, American writers can spy on other Americans without feeling uncomfortable and without feeling that what they’re doing is morally problematical.

Wallace: “If we want to know what American normality is – i.e. what Americans want to regard as normal – we can trust television… [W]riters can have faith in television” (22).

“Trust what is familiar!” in other words.  “Embrace what is in front of you!” to paraphrase.  Most contemporary American writers grew up in the lambent glow of the cathode-ray tube, and in their sentences the reader can hear the jangle and buzz of television.  David Foster Wallace was wrong.  No, writers should NOT trust television.  No, they should NOT have faith in the televisual eye, the eye that is seen but does not see.  The language of television has long since colonized the minds of contemporary American writers, which is likely why David Foster Wallace, Chuck Klosterman, and Jonathan Safran Foer cannot focus on a single point for more than a paragraph, why Thomas Pynchon’s clownish, jokey dialogue sounds as if it were culled from Gilligan’s Island, and why Don DeLillo’s portentous, pathos-glutted dialogue sounds as if it were siphoned from Dragnet.

There are scattershot arguments here, the most salient one being that postmodern fiction canalizes televisual waste.  That is my phrasing, not Wallace’s.  Wallace writes, simply and benevolently, that television and postmodern fiction “share roots” (65).  He appears to be suggesting that they both sprang up at exactly the same time.  They did not, of course.  One cannot accept Wallace’s argument without qualification.  To revise his thesis: Postmodern fiction–in particular, the writings of Leyner, DeLillo, Pynchon, Barth, Apple, Barthelme, and David Foster Wallace–is inconceivable outside of a relation to television.  But what would the ontogenesis of postmodern fiction matter, given that these fictions are anemic, execrably written, sickeningly smarmy, cloyingly self-conscious, and/or forgettable?

It did matter to Wallace, since he was a postmodernist fictionist.  Let me enlarge an earlier statement.  Wallace is suggesting (this is my interpretation of his words): “Embrace popular culture, or be embraced by popular culture!”  The first pose is that of a hipster; the second pose is that of the Deluded Consumer.  It would be otiose to claim that Wallace was not a hipster, when we are (mis)treated by so many hipsterisms, such as: “So then why do I get the in-joke? Because I, the viewer, outside the glass with the rest of the Audience, am IN on the in-joke” (32).  Or, in a paragraph in which he nods fraternally to the “campus hipsters” (76) who read him and read (past tense) Leyner: “We can resolve the problem [of being trapped in the televisual aura] by celebrating it.  Transcend feelings of mass-defined angst [sic] by genuflecting to them.  We can be reverently ironic” (Ibid.).  Again, he appears to be implying: “Embrace popular culture, or be embraced by popular culture!”  That is your false dilemma.  If you want others to think that you are special (every hipster’s secret desire), watch television with a REVERENT IRONY.  Wallace’s hipper-than-thou sanctimoniousness is smeared over every page.

Now let me turn to the Lynch essay, the strongest in the collection.  There are several insightful remarks here, particularly Wallace’s observation that Lynch’s cinema has a “clear relation” (197) to Abstract Expressionism and the cinema of German Expressionism.  There are some serious weaknesses and imprecisions, as well.

Wallace: “Except now for Richard Pryor, has there ever been even like ONE black person in a David Lynch movie? … I.e. why are Lynch’s movies all so white? … The likely answer is that Lynch’s movies are essentially apolitical” (189).

To write that there are no black people in Lynch’s gentrified neighborhood is to display one’s ignorance.  The truth is that at least one African-American appeared in the Lynchian universe before Lost Highway: Gregg Dandridge, who is very much an African-American, played Bobbie Ray Lemon in Wild at Heart (1990).  Did Wallace never see this film?  How could Wallace have forgotten the opening cataclysm, the cataclysmic opening of Wild at Heart?  Who could forget Sailor Ripley slamming Bobbie Ray Lemon’s head against a staircase railing and then against a floor until his head bursts, splattering like a splitting pomegranate?

To say that Lynch’s films are apolitical is to display one’s innocence.  No work of art is apolitical, because all art is political.  How could Wallace have missed Lynch’s heartlandish downhomeness?  How could he have failed to notice Lynch’s repulsed fascination with the muck and the slime, with the louche underworld that lies beneath the well-trimmed lawns that line Lynch’s suburban streets?  And how could he have failed to draw a political conclusion, a political inference, from this repulsed fascination, from this fascinated repulsion?

Let me commend these essays to the undiscriminating reader, as unconvincing as they are.  Everything collected here is nothing if not badly written, especially “Getting Away from Already Being Pretty Much Away from It All,” a hipsterish pamphlet about Midwestern state fairs that would not have existed were it not for David Byrne’s True Stories (1986), both the film and the book.  It is my hope that David Foster Wallace will someday be remembered as the talented mathematician he perhaps was and not as the brilliant fictioneer he certainly was not.

Joseph Suglia


ORIGINALLY PUBLISHED IN THE FACTS ON FILE COMPANION TO THE AMERICAN NOVEL

An Analysis of V. (Thomas Pynchon) by Joseph Suglia

“Suppose truth were a woman…”
–Friedrich Nietzsche, Beyond Good and Evil

All readers undergo a voyage to discover hidden meanings–a voyage which is also a passage of self-discovery.  Like most meta-fictional narratives, Thomas Pynchon’s first novel, V. (1963), is about the act of reading itself and the possibility or impossibility of self-reading.

Never has reading seemed so lugubrious.  The plot concerns Stencil, the son of a now-deceased British foreign officer, who, accompanied by the “schlemihl” Benny Profane, half-heartedly searches for the elusive “V.”–who might be a woman, a thing, a concept, a sewer rat, or nothing at all.  Stencil is a reader, broadly understood: He attempts to interpret the meaning of an initial.  Reading is here a process without progress and without terminus: Stencil never succeeds in identifying the initial’s referent.  As his name implies, Stencil can only trace the outlines of that which he seeks; his search is, to a certain extent, a fruitless yearning for truth.

To put an end to the process of reading would be to lose one’s human spontaneity.  For this reason, “V.” must never be found.  If “V.” were found, Stencil would become indistinguishable from an inanimate object.  The search for “V.” is the only thing that distinguishes him from a thing: “His random movements before the war had given way to a great single movement from inertness to–if not vitality, then at least activity” [55].  Both Profane and Stencil are terrified of the world of objects.  They fear their stasis, their contagious inanimateness.  The inanimate objects that populate Pynchon’s narrative often resemble human beings, such as the beer tap that is shaped in the form of a “foam rubber breast” [16].  Human beings, conversely, are themselves often functional and machinelike: e.g., Benny Profane’s jaunts resemble the idiotic up-and-down movements of a yo-yo; Rachel’s words are described as “inanimate-words [Profane] couldn’t really talk back at” [27], etc.  All of the “characters” in the novel are threatened by the lifeless world of things.  Stencil needs to search for the inaccessible in order to separate himself from the inanimateness of objecthood, in order to avoid freezing into a thingly state: “He tried not to think, therefore, about any end to the search. Approach and avoid” [55].  If “V” were found, it would be necessary to lose it again and to reinitiate the search.

Readers are implicated in this impossible quest, involuntarily placed in the position of code-breakers.  Like Stencil, they obsessively ask themselves, “Who, then, is V.?”  Because the identity of “V.” is never completely given, the solution to the code seems to withdraw abyssally into darkness.  Without an answerable meaning, the “alien hieroglyphic[-]” [17] seems to exist on its own terms.  The book’s center, it would seem, is not some intentional content that would lie behind or beyond the code, but, rather, the code itself.  The cipher itself is illuminated, not its meaning.  The point of interpretation is no longer to identify a transcendental meaning or theme, but rather to sift through the fragments and details of the narrative, the ill-fitting pieces of a jigsaw puzzle.  The unanswerable question “Who, then, is V.?” incites us to return to the forgotten or neglected world of appearances.  Bluntly stated, the disconnected pieces of Pynchon’s narrative are what is essential, not the “whole” to which they would belong.

Pynchon’s novel is an anti-adventure story about the plight of reading.  It challenges us to interpret something–the initial “V.”–without thinking in the categories of totality or universality.  The particular clues in the story do not relate to the universal.  Any interpretation that thinks in the language of totality or universality, in this context, is doomed to failure.

V. concerns the failure of reading and self-reading.  Stencil’s obsessive yet ultimately grim and joyless quest is to discover his own provenance (the search for “V.” is, to a certain extent, the search for his own father, der Vater in German) and therefore to discover his own identity.  And yet there is no definitive conclusion to the process of self-reading; therefore, there is no definite self-understanding.  Stencil’s identity is determined by the impossible which he seeks: “[H]e was quite purely He Who Looks for V.” [225].  If this process had any finality, he would be nothing at all–that is to say, nothing more than a thing, one thing among others.

The task of reading, then, must remain an infinitely provisional task.  Brenda remarks to Profane in Malta: “‘You’ve had all these fabulous experiences. I wish mine would show me something.’ / ‘Why.’ / ‘The experience, the experience. Haven’t you learned?’ / Profane didn’t have to think long. ‘No,’ he said, ‘offhand I’d say I haven’t learned a goddamn thing'” [454].  Stencil and Profane are led on an issueless quest–as are those of us who follow them.  The absence of anything like a decipherable meaning forces us to think about why we read: The book reveals our desire to discover order in chaos, to impose structure and coherence on entropy (disorder and stasis), to implement systems where there is none.

According to the metaphorics of V., the search for meaning is more imperative than the meaning that is sought.  Such is the significance of the non-questions that populate the book–questions that are unshelled of the interrogative form: “What are you afraid of” [36]; “Do you like it here” [40], etc.  These questions without questions remind us that, when approaching this book, we must pose questions without hankering after results.  The question is its own answer.  The answer is the question’s misfortune.

P.S. The novel has a sterile, lifeless prose style.

Dr. Joseph Suglia


An Analysis of THE HISTORY OF LOVE (Nicole Krauss)

by Dr. Joseph Suglia

Nicole Krauss’s The History of Love (2005) would have been better titled The History of Stupidity.  Like her contemporaries and congeners, Krauss approaches her readership with contempt (i.e., with a set of low expectations).  Most Americans, after all, are gum-chewing television-watchers who have never picked up a book in their lives.  I certainly do not believe this tiresome cliche, but the American publishing industry does.  And so does Nicole Krauss.

Krauss panders.  She explains everything to the reader.  In the end, the reader feels insulted for being treated with such contempt.  I am not fooled by the novel’s pretensions to experimentalism (this is NOT a formally challenging novel).  Yes, we are presented with three interlocking narratives: one written by an old man, another by the woman he loves, and a third by a fourteen-year-old girl.  But the plot is hideously simplistic: An old man writes a book inspired by his inamorata, Alma.  The book gets away from him.  Alma reads the book.  Fin.

Krauss has mastered the marketing strategies of her erstwhile husband, the celebrity-obsessed Jonathan Safran Foer, who also uses the interlocking narrative structure, a superabundance of nearly-blank pages, and narrators who are functionally illiterate.  In the end, The History of Stupidity feels as if it were a self-advertisement–not so much an advertisement for the author as an advertisement for itself.  Much like the object of SUV commercials, the target audience here is painfully clear: Typical Dumb Americans who find sweet old men and little girls stupidly charming.  Again, it is not I who believe this tiresome cliche.  It is Nicole Krauss.

Not merely is the novel infantile from a formal perspective; the content is similarly stunted.

Particularly stunning are Krauss’s scatological obsessions.  I am not suggesting that authors should not take scatology as their subject (Roland Topor created a masterly play on coprophilia entitled Leonardo Was Right), nor am I attacking the book on some pseudo-moralistic, Medvedian ground.  H. G. Wells assailed James Joyce (whose name is showcased, pointlessly, twice in this novel) for the latter’s so-called “cloacal obsession.”  But though there is scatology in Joyce, it serves a “transcendent” purpose.  In Krauss, however, the references to the excremental point to nothing other than themselves.  Nothing is more infantile than gastrointestinal humor.

And so we have Leo Gursky struggling with a bowel movement on Page Fifteen, “Zvi Litvinoff” defecating on Page Sixty-Nine, and a tzaddik in an outhouse engaging in one of the “coarse miracles of life” on Page 127.  I could go on, but I don’t want to.  Nicole Krauss seems fascinated by excrementality, which seems appropriate, since her book is a steaming mound of yellow horse-dung.

One last thing: If Leo Gursky has written such an important book, why are all of the passages cited halting and puerile?

What we are witnessing is the “dumbing-down” of literary fiction.  We need a new constructivism (I do not use this word in its traditional sense), after three decades of infantilism in American letters.

Joseph Suglia

Dave Eggers is a Bad Writer / A review of YOUR FATHERS, WHERE ARE THEY? AND THE PROPHETS, DO THEY LIVE FOREVER? (Dave Eggers) by Dr. Joseph Suglia

One of the most important claims of anti-foundationalism–what is usually called “postmodernism,” the making-fashionable of anti-foundationalism–is that nothing has a single, unified meaning and that systems that pronounce single, unified meanings are fascistic.  Anti-foundationalist writing / film opens and multiplies meanings.  No matter what you say about an anti-foundationalist work of art, you will be wrong: Another interpretation is always possible.  We are all familiar with the rapid occlusions of commercial writing / film–once an alternative meaning appears, it is just as quickly shut out.

Dave Eggers is sometimes referred to, erroneously, as a “postmodern” writer.  It is important to correct this misinterpretation.  Dave Eggers is not a “postmodern” (read: anti-foundationalist) writer.  He is a lazy, slovenly commercial writer who has an unattractive prose style.

Eggers’s most recent catastrophe, Your Fathers, Where Are They? And the Prophets, Do They Live Forever? (2014), could have been written in two hours.  It is entirely composed of dialogue–an easy move for a lazy writer such as Eggers.

The dialogic novel is certainly nothing new.  The dialogic form can be found in Evelyn Waugh’s Vile Bodies (1930), Henry Green’s Nothing (1950), Charles Webb’s 1963 novel The Graduate, and Nathalie Sarraute’s satirical novel Les Fruits d’or (1964).  John Fowles’s A Maggot (1985) qualifies, though it is not entirely told in dialogic form.  Zora Neale Hurston’s Their Eyes Were Watching God (1937) is, arguably, a quasi-dialogic novel.  There has never been a stronger novel in this subgenre than the great Roland Topor’s Joko’s Anniversary (1969) (in French: Joko fête son anniversaire), one of the most underrated novels ever published.  And of course, there is Chapter Fifteen of Joyce’s Ulysses (the so-called “Circe” or “Nighttown” episode).  Sadly, most dialogue-driven novels these days are proto-screenplays.  Since the 1960s, most commercial novels have been proto-screenplays, and this, I would argue, has led to the death of literature.  (For reasons of economy, I cannot pursue this argument here.)

The title is taken from The Book of Zechariah [1:5].  The book’s learnedness ends there.  In a style that owes nothing to Zechariah, Eggers will condemn American Society for not giving Young American Men what they are owed.

Eggers’s prophet is Thomas, a thirty-four-year-old American.  His maleness, his age, and his Americanness are all important to understanding this novel as a cultural document.  Why the name “Thomas”?  We’re supposed to think of Thomas Paine (use contractions, or Eggers will get angry at you).

I write that Thomas is “Eggers’s prophet” because he has the same political convictions as Eggers: The money that the U. S. borrows from China should not be used to subsidize foreign wars, but instead should be used to finance space exploration, education, health care, and public television.  Thomas whimpers:

“You guys fight over pennies for Sesame Street, and then someone’s backing up a truck to dump a trillion dollars in the desert” [42].  This is only one of the many jewels with which Eggers’s novel is bejeweled.

Eggers would like to persuade us that his prophet is a normal, likable young man, but his attempts at making Thomas seem likable and normal are nauseatingly hamfisted.  Thomas is “polite,” “nice,” and “friendly” and says repeatedly that he has no intention of killing anyone.  Because Thomas tells us that he is a “principled” person (on page 7 and then again on page 84, in case we missed it), we are supposed to believe that Thomas is a principled person.  There is very little logos in the novel, but there definitely is a great deal of ethos.

And a great deal of pathos.  Unhappily, all of the pathos is artificial, particularly the pathos that is communicated when Thomas “falls in love” with a woman he sees strolling on a beach.  The emotions in this book have the same relationship to real emotions that the fruit-flavors of chewing-gum have to real fruit.

Eggers would like to persuade us, then, that Thomas is a principled young man who kidnaps an Astronaut, a Congressman, an Overeducated Pederast Teacher, his own Mother, a Police Officer, a “Director of Patient Access,” and a Hot Woman; each of these characters is a lifeless stereotype.  Such a rhetorical strategy would be difficult for even a serious and careful writer, and because Eggers is neither (don’t say it with a long I, or Eggers will get angry at you), the outcome resembles a railway accident.

Thomas is an Angry Young Man of the same pedigree as Dylan Klebold, Eric Harris, James Holmes, and Jared Lee Loughner.  And why is he angry?  Because his “friend” Kev never got on the Space Shuttle.  Because Thomas’s life didn’t turn out the way he wanted it to.  Don’t we live in America?  Aren’t Young American Men promised success and happiness?  Thomas rails against the Congressman:

“You should have found some kind of purpose for me” [37].

And: “Why didn’t you tell me what to do?” [Ibid.].

Why, Daddy, why didn’t you tell me what to do?  Why didn’t you “find a place” for me [47]?  Isn’t there a safe and secure place in the world reserved specially for me?  Why doesn’t the world need ME?

It is so sad that Thomas was promised success and happiness (by whom?) and that he never received either (say it with a long E or Eggers will grow irate with you) of these things.  It is so sad that Kev never got on the Space Shuttle.  Thomas unburdens himself to the Congressman: “That just seems like the worst kind of thing, to tell a generation or two that the finish line, that the requirements to get there are this and this and this, but then, just as we get there, you move the finish line” [34].

The world owes us success and happiness, doesn’t it?  And when we don’t get it, we get real angry!  Much of the novel is based on the mistaken idea that Young American Men are entitled to success and happiness.  And Thomas represents all disenfranchised Young American Men.  As Thomas says to the Congressman–his substitute “father”–at the close of the novel:

“There are millions more like me, too.  Everyone I know is like me…  [I]f there were some sort of plan for men like me, I think we could do a lot of good” [210; emphasis mine].

This is the worldview of a stunted, self-pitying, lachrymose adolescent.  It is the worldview of Dave Eggers.

To return to the opening paragraphs of this review: Eggers, hardly an anti-foundationalist writer, thinks that life is essentially simple and that everything should have an unequivocal meaning: “You and I read the same books and hear the same sermons and we come away with different messages,” Thomas laments.  “That has to be evidence of some serious problem, right?” [45].

It has to be!

Perhaps the novel would be endurable if it were well-written, but Dave Eggers is a mushhead with all of the style of a diseased hippopotamus.  He draws from a stock of words that is available to most English-speaking humans.  He writes familiar things in a familiar way.  He has a problem with people who say “either” with a long “I,” but misuses the word “parameter” (twice, by my count).

The spiritlessness with which he writes is dispiriting.  The prose is lenient.  Serpentine sentences are forsaken in favor of a simple syntax.  Apparently, I am one of the few people alive who enjoy reading sentences that spread across the page like flourishing trees.

Despite its many flaws, the book will be praised for the same reason that audiences laugh while watching Saturday Night Live: Most human beings are followers and do what they think they are expected to do.

Joseph Suglia

A review of A HOLOGRAM FOR THE KING (Dave Eggers) by Dr. Joseph Suglia

All novels may be taxonomized into three categories: There are novels of plot, novels of character, and novels of language.  A novel of plot is driven by a story that could be synopsized without damaging the novel itself.  Simply read an outline of the plot, and there is no reason for you to read the novel.  A novel of character creates–or should create–living-seeming, recognizably human figures.  But these figures, of course, are nothing more than fabrications, nothing more than chimeras that seem to breathe and talk.  A novel of language makes worlds out of words.

Dave Eggers’s A Hologram for the King (2012) is a novel of character, I suppose, but it doesn’t really work as a novel of character.  Nor does it work on any other level.  It must be said of this miserable little drip of a book that it fails as a plot-driven narrative, that it fails as the portrait of a character, and that it fails as a work of language.

PLOT

Eggers has the tendency to write novels that are based on American high-school standards.  You Shall Know Our Velocity–a novel that is as sincere as those fraternity boys who raise money for the homeless–is based on On the Road.  The Circle is based on Nineteen Eighty-Four.  A Hologram for the King is based on Death of a Salesman and En attendant Godot (the epigraph is from Beckett’s play: “It is not every day that we are needed”).  En attendant Godot is about the stupidities of faith, the stupidities of eschatology, and the infinitely postponed arrival (or non-arrival) of the Messiah.  And yet Eggers’s Messiah arrives!  If Eggers wanted a classic about the degradations of growing old on which to model his tale, he should have turned to Bellow.  Henderson the Rain King, anyone?

Alan Clay is a semi-employed fifty-four-year-old former bicycle manufacturer who is contracted by Reliant, a major IT company, to introduce King Abdullah to a holographic projection system.  The inaction takes place in King Abdullah Economic City (KAEC), Saudi Arabia.  Every day, Alan and his enviably young colleagues wait in the desert for the arrival of King Abdullah.

Novels do not need to be realistic, but they ought to be convincing, and the question of probability comes up more than a few times.  If Alan is indeed “superfluous to the forward progress of the world” [75], why is he employed by the largest IT company of that same world and promised $500,000 if he succeeds in persuading King Abdullah to purchase the holographic projection system?

This is a novel about late arrivals, and Alan and his “Other” are forever arriving late to the party: Alan is too late to save his neighbor Charlie Fallon from self-drowning, Alan wakes up late on the day of his scheduled meeting with King Abdullah, Alan is “too late” (read: “too old”) to be sexually potent, King Abdullah himself arrives late, etc.  I would advise prospective readers to never arrive.

CHARACTER

As synaesthetes know, everything has a color.  Eggers’s washout is not exactly an iridescent character.  He is relentlessly grey.

A character should be, to paraphrase the Hegel of Die Phänomenologie des Geistes, an assemblage of Alsos.  That is: A character should not be one thing.  A character should not be simple.  A character should not be one-sided.  A character should be this AND ALSO that AND ALSO that.  These traits should contradict one another.  Since human beings are complexly self-contradictory, why should characters not be, as well?

Regrettably, Eggers’s main character is flatter than a Fruit Roll-Up.  Alan is a never-was and has never been anything besides a never-was.

While waiting for King Abdullah, Alan meets (guess who!) two sexually prepossessing young women: a gorgeous blonde Dutch consultant named Henne and a Saudi physician named Zahra Hakem who is intrigued by the knob-like excrescence on the back of his neck.  At one stage, Alan imagines that his cyst has sexual powers.  I could imagine the entire novel centering on the sexuality of Alan’s cyst, but no, that would have been too daring.  This is a Dave Eggers novel, after all.

Each appointment leads to a sexual disappointment.  Henne offers Alan sexual release in the bathtub of her hotel room, but Alan prefers the “purity” and “simplicity” [177] of the bath water instead.  Dr. Zahra swims topless with Alan (this, apparently, is done all of the time in Islamic countries), but her toplessness does not lead to a toplessness-inspired act of sexual release.

Eggers simply cannot let his ageing protagonist be sexually uninteresting to women.  Even though the novel pretends to be an allegory about the downfall of America in an age of globalism, it is really an all-American wish-fulfillment fantasy.  Are we credulous enough to believe that the generously breasted blonde Dutch consultant is sexually desperate?  And that Dr. Zahra lusts after Alan’s knobby cyst?  Apparently, Eggers thinks that we are.

LANGUAGE

Eggers is more of a summarizer than he is a dramatizer.  He tells more than he shows.  An example (from the novel’s opening salvo):

[Alan] had not planned well.  He had not had courage when he needed it.  /  His decisions had been short sighted [sic].  /  The decisions of his peers had been short sighted [sic].  /  These decisions had been foolish and expedient.  /  But he hadn’t known at the time that his decisions were short sighted [sic], foolish or expedient.  He and his peers did not know that they were making decisions that would leave them, leave Alan, as he now was–virtually broke, nearly unemployed, the proprietor of a one-man consulting firm run out of his home office [4].

Now, a hard-working writer would do the grueling work of showing us Alan’s failures and shortcomings rather than telling us about Alan’s failures and shortcomings.  Eggers is less of a writer than a publicist.  The passage quoted above reads as if it came from a query-letter addressed to a literary agent.

Wading through the brackish waters and the fetid marshlands of Eggers’s prose is not much fun.  I never once got the impression that the writer was groping for the right word.  To say that Eggers’s prose style wants elegance and richness would be a gross understatement.  His word choices are banal and obvious, his vocabulary is restricted, his writing style is plain, his paragraphs are dull.  To describe Alan’s dispute with Banana Republic over a one-time purchase that has killed his credit-score, Eggers writes, doltishly, “Alan tried to reason with them” [138].  This sentence could not have been written any more unpoetically and is yet another instance of the lazy “telling” of an unqualified writer rather than of the laborious “showing” which is incumbent on every responsible writer of fiction.

Eggers’s writing is so bad that it is almost ghoulish.

I have heard it said of Eggers that he is a man who is “easy on the eyes,” and I have no doubt that this is true.  (His lecteurial admirers have a purely phenomenal interest in the writer.  That is to say, they don’t care about the writing; they are only interested in the writer qua man.)  Though I am not an adroit evaluator of male beauty, I suspect that Eggers-the-Man is indeed “easy on the eyes.”  It is a pity that the same could not be said of the books that he types.

Dr. Joseph Suglia

Corregidora / Corrigenda – by Joseph Suglia

A typical response to genocide is the injunction to remember.  All of us have heard the words “Never forget!” in reference to the Shoah.  Most are familiar with Kristallnacht, with the Names Project, also known as “the AIDS Quilt.”  The March for Humanity memorializes the mass-murder of Armenians by Ottoman Turks.  Every year, at this time in April, the Rwandan government urges its citizens to kwibuka—the Kinyarwanda word for “to remember.” To kwibuka, to remember the countless Tutsis who were slaughtered in the massacre of 1994.

But how should one respond when genocide is misremembered?  Is the misremembrance of genocide superior to the forgetting of genocide?

Which is worse, distortion or oblivion?

Is it worse to minimize, for example, the number of Armenians who were killed at the beginning of the twentieth century, or to forget that the genocide of Armenians ever occurred?

The dominant medium of the twentieth century was the cinema, and the cinema still has the power to shape, and to misshape, collective memory.

Over the past seven years, a talentless hack filmmaker named Quentin Tarantino has manufactured films that I would not hesitate to describe as “genocide pornography.”  That is to say, these are films that would turn genocide into an object of consumption, an object of enjoyment.  These are also films that disfigure historical consciousness.

Thanks to Quentin Tarantino, the succeeding generation might believe that the Jews defeated the Nazis.  Thanks to Quentin Tarantino, they might believe that Hitler was assassinated.  They might believe that, in general, African slaves rose up and overcame their enslavers.  They might believe that every African slave in antebellum America was a free agent.  Not an insurrectionist like Nat Turner, but an action figure like Django.

But what if misremembrance were not a disfiguration or a distortion of memory?  What if misremembrance plays a constitutive and formative role in memory itself?

Freudian psychoanalysis has something to say about the interpenetration of remembrance and misremembrance.

At the earliest stage of his career, between the years 1895 and 1897, Freud formulated what is called “seduction theory.”  Seduction theory is based on the idea that sexual trauma is pathogenic—that is, that sexual abuse produces neuroses.

Freud rejected seduction theory in 1897, but this does not mean that he silenced the voices of abused children.  From the beginning of his career until its end, Freud never ceased to emphasize that sexual trauma has pathological effects.

Why did Freud reject seduction theory?  Because it was too linear, too simple, because it did not take into consideration the supremacy of the unconscious.

The memory of sexual trauma, Freud recognized, might be repressed, sublimated, externalized, transferred, reintrojected, reimagined, or fictionalized.

This does not mean that when children claim that they have been sexually abused, they are lying.  It means, rather, that experiences of abuse pass through the imagination and the imagination passes through the unconscious.  Seduction theory did not take the imagination—die Phantasie—into account and therefore had to be abandoned.

The unconscious, as Freud wrote to Wilhelm Fliess, does not distinguish between fact and fantasy.

It is difficult for a victim of abuse to acknowledge his or her trauma directly, and Freud knew this.  Sexual trauma, after it occurs, does not manifest itself directly or immediately, but epiphenomenally—that is to say, symptomatically.  It shows itself in disguise.  It dramatizes itself.  It retraumatizes.  It might be phantasmatically reconstituted.

From the Freudian standpoint, remembrance and misremembrance are not mutually exclusive.

There is a third form of misremembrance that I would like to pause over.  It is the kind of anamnesis or déjà vu in which an individual recollects not her own individual history, but the history of past generations, the history of her ancestors.  Cultural memory, seen from this perspective, would be a form of misremembrance.

Such misremembrance could only be figured in art.

The literature of Gayl Jones reminds us that the remembrance of personal trauma always contains a cultural dimension, that all memory is misremembrance.

The past that you have experienced is not the past that you remember.

When I first heard the title of Jones’s first novel — Corregidora  (published in 1975) — I thought it was “corrigenda.”

Corrigenda: a list of errors in a published manuscript.

* * * * *

At the novel’s opening, lounge singer Ursa Corregidora is shoved down a staircase by her husband, Mutt — a catastrophic blow that results in her infertility. After she renounces her husband, Ursa enters into a relationship with Tadpole, the owner of the Happy Café, the bar at which she performs. Like all of her significant relationships with men, this second relationship proves disastrous and is doomed to failure.

Every man in the novel, without exception, sees Ursa as a “hole” — that is, as a beguiling and visually appealing receptacle to be penetrated. The narrative suggests this on the figural level. A talented novelist, Jones weaves images of orifices throughout her text — tunnels that swallow and tighten around trains, lamellae such as nostrils, mouths, wounds, etc. Although one of Ursa’s “holes” is barren, another “hole” is bountifully “prosperous”  — her mouth, from which the “blues” issue. A movement of sonic exteriorization corresponds to a counter-movement of physiological interiorization.

It is easy to be trapped by these more immediate, socio-sexual dimensions of the narrative. Corregidora might seem, prima facie, to be nothing more than another novel about a woman imprisoned in abusive and sadistic relationships with appropriative men. But the meanings of Corregidora are far more profound than this.  A “transcendental” framework envelops the immediate narrative and casts it in relief, thereby enhancing its significance.  We learn that Ursa is the great-granddaughter of Portuguese slave-trader and procurer Corregidora, who sired both Ursa’s mother and grandmother.  Throughout the course of the novel, the men in Ursa’s life take on a resemblance to Corregidora — and this resemblance sheds light on both the sexual basis of racism and the tendency of some oppressed cultures to take on the traits of imperialist hegemonies.  According to the logic of the novel, the children of slaves resemble either slaves or slave drivers.  Even within communities born of slavery, the novel suggests, there persist relationships of enslavement.  “How many generations had to bow to his genital fantasies?” Ursa asks at one point, referring to Corregidora the Enslaver.  As long as hierarchical relationships form between men and women in the African-American community, Jones’s novel suggests, there will never be an end to this period of acquiescence; Corregidora will continue to achieve posthumous victories.

As long as hierarchical relationships form between men and women in the African-American community, the novel suggests, the enslavers will continue to achieve posthumous victories.

As long as hierarchical relationships form between men and women in the African-American community, the novel suggests, the segregationists and the white supremacists will continue to achieve posthumous victories.

To return to the opening statement of this essay: A typical response to genocide is the injunction to remember. Although her infertility robs Ursa of the ability to “make generations” — something that, she is taught, is the essence of being-woman — she can still “leave evidence,” can still attest to the historical memory of slavery.  All documents that detailed Corregidora’s treatment of his slaves were seemingly destroyed, as if the abolition of slavery abolished memory itself.  According to the injunction of the Corregidora women (Ursa’s ancestors), one must testify, one must re-member, one must “leave evidence.”  And yet memory is precisely Ursa’s problem.  Memory cripples her.  Throughout the novel, Ursa struggles to overcome the trauma of her personal past.  And this past — in particular, the survival in memory of her relationship with Mutt — belongs to the larger, communal past that is her filial legacy.  Her consciousness is rigidified, frozen in the immemorial past of the Corregidora women.  This “communal” past is doomed to repeat itself infinitely, thus suspending the presence of the present — and, in particular, Ursa’s individual experience of the present.  Her individual experience of the present is indissociably married to her personal past, and her most intimate past is, at the same time, also the past of her community.  The words that Ursa uses to describe her mother could also apply to Ursa herself: “It was as if their memory, the memory of all the Corregidora women, was her memory too, as strong within her as her own private memory, or almost as strong.”

At the shocking and unforgettable close of the novel, the past and present coincide almost absolutely.  When, after twenty-two years of estrangement, Ursa is reunited with her first husband, the historical memory of slavery is superimposed and mapped onto their relationship. Ursa and Mutt become allegorical figures, representing slave and slaveholder, respectively.  The present-past and the past-present reflect each other in an infinite mirror-play until they become almost indistinguishable from each other.

At the juncture of both temporalities is an inversion of power relations that comes by way of a sex act.  Ursa performs fellatio on her first husband.  Oral sex replaces oral transmission.  Here we have the perpetuation of a traumatic past, and yet it is a repetition with a difference.  Fellatio is disempowering for the man upon whom it is performed; dangerously close to emasculation, it is experienced as “a moment of broken skin but not sexlessness, a moment just before sexlessness, a moment that stops just before sexlessness.”  For the woman, by contrast, it might be an act vacant of all sensuality, one that is abstracted of all emotional cargo.  Fellatio might infuse the performer with a feeling of power’s intensification; its objective might not be the enhancement of erotic pleasure, but of the pleasure that comes with the enhancement of one’s feeling of power.

By playing the role of the guardian of memory, Ursa dramatizes the intersection of her individual past with a communal past.  The paralysis of historical consciousness sets in: “My veins are centuries meeting.”

End of quotation, and the end of the essay.

Dr. Joseph Suglia