Three Aperçus: On DEADPOOL (2016), David Foster Wallace, and Beauty


by Joseph Suglia

Deadpool (2016) is capitalism with a smirking face.

David Foster Wallace was not even a bad writer.

Beauty is the one sin that the Average American of today cannot forgive.

*


Two Aperçus: THE NEON DEMON (2016)


The Neon Demon (2016) is a snuff film in which art is murdered.

Descent (2007) is superior to The Neon Demon because the former has an Aristotelian structure–which works.

Joseph Suglia


HOUSE OF LEAVES by Mark Z. Danielewski / WHEN DID WRITING STOP HAVING TO DO WITH WRITING? – by Dr. Joseph Suglia


WHEN DID WRITING STOP HAVING TO DO WITH WRITING?
by Dr. Joseph Suglia

When did writing stop having to do with writing?  Of the many attempts to communalize literature, none is more dangerous than the sway of the current ideology: the consensus, and consciousness, that writing has nothing to do with writing.  You will hear readers talk about “plot” (in other words, life).  You will hear them talk about the “author.”  But writing?  Writing has nothing to do with writing.  No one cares whether a book is well-written anymore.

* * * * *

Mark Z. Danielewski is not very much interested in language.  He cares more about graphics than he does about glyphs.  No words live in his House of Leaves.  It is a house of pictures, not of words.  It is a house in which words only exist as blocks of physical imagery.

Allow me to cite a few not unrepresentative sentences/fragments from House of Leaves:

1.) “A hooker in silver slippers quickened by me” [296].  Danielewski, scholar, thinks that “to quicken” means “to move quickly.”

2.) “Regrettably, Tom fails to stop at a sip” [320].  I convulse in agony as I read this sentence.

3.) “Pretentious,” too often, is American for “intelligent.”  It is a word that is often misapplied.  However, in the case of House of Leaves, it must be said that Danielewski uses German pretentiously.  In a book that is littered with scraps of the German language, shouldn’t that language be used properly?  “der absoluten Zerissenheit” [sic; 404 and elsewhere — a Heideggerian citation] should read “die absolute Zerrissenheit”–the genitive is never earned.  “unheimliche vorklaenger” [sic; 387] should read “unheimliche Vorklänge” and does not mean “ghostly anticipation.”  Whenever Danielewski quotes the German, he is being pretentious–that is, he is pretending to know things of which he knows nothing.

It is impossible to escape the impression that Mark Z. Danielewski does not want to be read.  Noli me legere = “Do not read me.”  The House of Leaves is a book to be looked at, not one to be read.  Its sprawling typographies and fonts distract the reader from the impoverished prose.

Words are reduced to images, to pictures.

* * * * *

When did writing stop having to do with writing?  When novels became precursors to screenplays.  The terminus a quo is 1963, with the publication of Charles Webb’s The Graduate.  The novel is a proto-screenplay, as was Ira Levin’s Rosemary’s Baby, published in 1967.  The film studio (William Castle Enterprises) optioned the novel even before Levin finished writing it!  Astoundingly, Rosemary’s Baby, according to my interpretation, is a novel about the diabolical essence of the Hollywood entertainment industry!

With the rise of mainstream cinema came the denigration of literature.  The visual overthrew the verbal.  Around the same time, imaginative prose began to be dumbed well down.  There are two infantile reductions at work, both of which are visible in House of Leaves: a dumbing-down of language and an accent on the optical (as opposed to the verbal).

Such infantile reductions are everywhere in evidence whenever one picks up a contemporary American novel.  We can thank America for the coronation of the idiot and for an all-embracing literary conformism.  Even stronger writers, these days, morosely submit to the prevailing consolidation of a single “literary style.”  A style that, of course, is no style at all.  And these same writers, listlessly and lifelessly, affirm in reciprocal agreement that the construction of a well-wrought sentence isn’t something worth spending time on.  Or blood.

How self-complacent American writers have become!  The same country that produced Herman Melville, William Faulkner, and Saul Bellow has given birth to Mark Z. Danielewski.  Nothing is more hostile to art than a culture of complacency.

There was, I’m sure, something very refreshing about Charles Bukowski in the 1970s, when the vestiges of a literary academism still existed.  Mr. Bukowski, I am assuming, would be dismayed to uncover the kindergarten of illiterate “literati” to which he has illegitimately given birth.  His dauphin, Mark Z. Danielewski.

Weaker students of literature might feel invigorated by the Church of Literary Infantilism, yet even they know that the clergy engenders nothing sacred or profane.  This explains their virulent defensiveness when anyone, such as myself, dares to write well or explore another writer’s engagement with language.  “Writing doesn’t matter,” you see.  They have never luxuriated in the waters of language; they have never inhabited a world of words.  Words don’t interest them; people do.  And literary discussions have degenerated to the level of a bluestockinged Tupperware party.  If you like the main character, the book is “good.”  If a book is warm and friendly, that book is “good.”  If a book reassures you that you are not a slavering imbecile–that is to say, if you can write better than the book’s “author”–that book is “good.”  If a book disquiets you or provokes any kind of thought whatsoever, that book is “bad.”  If a book has an unsympathetic main character, that book is “bad.”  If a book is difficult to understand, that book is “bad,” and so forth and so on.  Whatever exceeds the low, low, low standards of the average readership, in a word, is blithely dismissed as “bad.”

Things grow even more frightening when we consider the following: These unlettered readers are quickly transforming into writers.  That would be fine if they knew how to write.  And if the movements of language were valued, culturally and humanly, their noxious spewings would find no foothold.  The literature of challenge has been supplanted by the litter of the mob, with all of its mumbling solecisms and false enchantments.  The problem with mobs, let us remind ourselves, is that they efface distinctions.  They do everything in their power to make the distinguished undistinguished.  And so instead of James Joyce, we have bar-brawling beefheads (e.g. Chuck Palahniuk), simian troglodytes (e.g. Henry Rollins), and graphic designers / typographers (e.g. Mark Z. Danielewski).

Instead of poeticisms, we have grunts.  We have pictures.  We have graphic design and cinema.

* * * * *

Someone said to me: “I am a good writer, but I don’t know how to spell.”

Someone said to me: “No writer is better than any other.”

* * * * *

America is responsible for the production of more linguistic pig-shit than any other country in the world.  There is absolutely nothing surprising about this statement.  After all, America is the only country that celebrates stupidity as a virtue.  How could things be otherwise?

At the poisonous end of the democratization process, which is indistinguishable from the process of vulgarization, every jackass on the street sees himself as an “author.”  His brother, his grandmother, and his step-uncle: they, too, regard themselves as “authors.”  After all, they think–inasmuch as they are capable of thinking–“Writing has nothing to do with writing.  If Mark Z. Danielewski can be published, so can I!”  (Yes, their desire is “to be published,” as if their lives would be inscribed on the page, disseminated, filmed, and thus rendered meaningful.)  We live in an age of all-englobing and infinitely multiplying cyber-technologies, where stammering imbeciles mass-replicate their infantile scribbles, but let us not deceive ourselves: If a “writer” is simply one who writes, then they are writers; however, one should reserve the word “author” only for those who are profoundly committed to the craft of verbal composition.

* * * * *

Judging from a purely technical point of view, House of Leaves is consistently faulty, fraught with excruciating Hallmark banalities and galling linguistic errors.  Hipster Mark Z. Danielewski is seemingly incapable of composing a single striking or insightful sentence.  It astonishes me that anyone ever considered his tinker-toy bromides to be publishable.  The House of Leaves is a house that is neither well-appointed nor ill-appointed.  It is simply not appointed at all.

* * * * *

Who cares about language anymore?  No one in America even questions the assumption that good writing does not matter.  And this assumption is no longer limited to America–a horrific logophobia is spreading throughout the globe.  The impetuses that motivate this tsunami of “literary” vomit are the following ideological assumptions: The fallacy that 1.) everyone is entitled to be an author (this is a particularly nasty perversion of the democratic principle) and that 2.) the visible improves on the verbal.  American letters have been reduced to the gibbering and jabbering of semiliterate simpletons, driveling half-wits, and slack-jawed middlebrows.  It’s only a matter of time before the English stop caring about language, as well.

When you live in a culture of complacency, a culture of appeasement, a hypocritical culture that assures you that you write well even if you don’t, there is only one way out.  There is nothing for the strong and serious student of literature to do but to write for himself, to write for herself, for his or her own sake.

Joseph Suglia


David Foster Wallace and Macaulay Culkin: Two aperçus


David Foster Wallace was a sudorific pseudo-author.

Macaulay Culkin only holds one thing in common with the young Lou Reed: a heroin addiction.

Joseph Suglia


A commentary on HUMAN, ALL-TOO-HUMAN / MENSCHLICHES, ALLZUMENSCHLICHES by Friedrich Nietzsche


HUMAN, ALL-TOO-HUMAN / MENSCHLICHES, ALLZUMENSCHLICHES (Friedrich Nietzsche)

A commentary by Joseph Suglia

MAM = Menschliches, Allzumenschliches. Ein Buch für freie Geister (1878); second edition: 1886

VMS = Vermischte Meinungen und Sprüche (1879)

WS = Der Wanderer und sein Schatten (1880)

The following will not have been an interpretation of Nietzsche’s Human, All-Too-Human.  It will have been a commentary: Comment taire? as the French say.  “How to silence?”  In other words: How should the commentator silence his or her own voice and invisibilize his or her own presence in order to amplify the sound of the text and magnify the text’s image?

An interpretation replaces one meaning with another, or, as Heidegger would say, regards one thing as another.  A commentary adds almost nothing to the text under consideration.

Nietzsche’s Psychological Reductionism and Perspectivalism

Human, All-Too-Human is almost unremittingly destructive.  For the most part, it only has a negative purpose: to demolish structures and systems of thought.  However, there is also a positive doctrine within these pages, and that is the doctrine of total irresponsibility and necessity (to which I will return below) and the promise of a future humanity that will be unencumbered by religion, morality, and metaphysics.

In the preface of the second edition (1886), Nietzsche makes the thrust and tenor of his book clear with the following words: The purpose of the book is “the inversion of customary valuations and valued customs” (die Umkehrung gewohnter Wertschätzungen und geschätzter Gewohnheiten).  The highest ideals are reduced to the basest human-all-too-humanness of human beings.  This is a form of psychological reductionism: Once-good values (love, fidelity, patriotism, motherliness) are deposed.  The man who mourns his dead child is an actor on an imaginary stage who performs the act of mourning in order to stir up the emotions of his spectators—he is vain, not selflessly moral.  The faithful girl wants to be cheated upon in order to prove her fidelity—she is egoistic, not selflessly moral.  The soldier wants to die on the battlefield in order to prove his patriotism—he is egoistic, not selflessly moral.  The mother gives up sleep to prove her virtuous motherliness—she is egoistic, not selflessly moral [MAM: 57].

The inversion of valuations leads to an advocacy of the worst values: vanity and egoism (but never the vaingloriousness of arrogance, against which Nietzsche warns us for purely tactical reasons).  As well as lying.  Nietzsche praises lying at the expense of the truth to the point at which lying becomes the truth, and the truth becomes a lie that pretends that it is true.  This, of course, is a paradox, for anyone who says, “There is no truth, only interpretations of truth” is assuming that one’s own statement is true.

Again and again, Nietzsche phenomenalizes the world.  Appearance (Schein) becomes being (Sein): The hypocrite is seduced by his own voice into believing the things that he says.  The priest who begins his priesthood as a hypocrite, more or less, will eventually turn into a pious man, without any affectation [MAM: 52].  The thing in itself is a phenomenon.  Everything is appearance.  There is no beyond-the-world; there is nothing outside of the world, no beyond on the other side of the world, no επέκεινα.

As far as egoism is concerned: Nietzsche tells us again and again: All human beings are self-directed.  I could have just as easily written, All human beings are selfish, but one must be careful.  Nietzsche does not believe in a hypostatized self.  Every individual, Nietzsche instructs us, is a dividual (divided against himself or herself), and the Nietzsche of Also Sprach Zarathustra (1883-1885) utterly repudiates the idea of a substantialized self.  To put it another way: No one acts purely for the benefit of another human being, for how could the first human being do anything without reference to himself or herself?: Nie hat ein Mensch Etwas gethan, das allein für Andere und ohne jeden persönlichen Beweggrund gethan wäre; ja wie sollte er Etwas thun können, das ohne Bezug zu ihm wäre? [MAM: 133].  Only a god would be purely other-directed.  Lichtenberg and La Rochefoucauld are Nietzsche’s constant points of reference in this regard.  Nietzsche never quotes this Rochefoucauldian apothegm, but he might as well have:

“True love is like a ghost which many have talked about, but few have seen.”

Or:

“Jealousy contains much more self-love than love.”

Whatever is considered “good” is relativized.  We are taught that the Good is continuous with the Evil, that both Good and Evil belong to the same continuum.  Indeed, there are no opposites, only degrees, gradations, shades, differentiations.  Opposites exist only in metaphysics, not in life, which means that every opposition is a false opposition.  When the free spirit recognizes the artificiality of all oppositions, s/he undergoes the “great liberation” (grosse Loslösung)—a tearing-away from all that is traditionally revered—and “perhaps turns [his or her] favor toward what previously had a bad reputation” (vielleicht nun seine Gunst dem zugewendet, was bisher in schlechtem Rufe stand) [Preface to the second edition].  The awareness that life cannot be divided into oppositions leads to an unhappy aloneness and a lone unhappiness, which can only be alleviated by the invention of other free spirits.

What is a “free spirit”?  A free spirit is someone who does not think in the categories of Either/Or, someone who does not think in the categories of Pro and Contra, but sees more than one side to every argument.  A free spirit does not merely see two sides to an argument, but rather as many sides as possible, an ever-multiplying multiplicity of sides.  As a result, free spirits no longer languish in the manacles of love and hatred; they live without Yes, without No.  They no longer trouble themselves over things that have nothing to do with them; they have to do with things that no longer trouble them.  They are mistresses and masters of every Pro and every Contra, every For and every Against.

All over the internet, you will find opposing camps: feminists and anti-feminists, those who defend religious faith and those who revile religious faith, liberals and conservatives.  Nietzsche would claim that each one of these camps is founded upon the presupposition of an error.  And here Nietzsche is unexpectedly close to Hegel: I am thinking of Nietzsche’s perspectivalism, which is, surprisingly, closer to the Hegelian dialectic than most Nietzscheans and Hegelians would admit, since they themselves tend to be one-sided.  In all disputes, the free spirit sees each perspective as unjust because one-sided.  Instead of choosing a single hand, the free spirit considers both what is on the one hand and what is on the other (einerseits—andererseits) [MAM: 292].  The free spirit hovers over all perspectives, valuations, evaluations, morals, customs, and laws: ihm muss als der wünschenswertheste Zustand jenes freie, furchtlose Schweben über Menschen, Sitten, Gesetzen und den herkömmlichen Schätzungen der Dinge genügen [MAM: 34].  It is invidiously simplistic and simplistically invidious to freeze any particular perspective.  Worse, it is anti-life, for life is conditioned by perspective and its injustices: das Leben selbst [ist] bedingt durch das Perspektivische und seine Ungerechtigkeit [Preface to the second edition].  A free spirit never takes one side or another, for that would reduce the problem in question to the simplicity of a fixed opposition, but instead does justice to the many-sidedness of every problem and thus does honor to the multifariousness of life.

There Is No Free Will.  Sam Harris’s Unspoken Indebtedness to Nietzsche.

Let me pause over three revolutions in the history of Western thought.

The cosmological revolution known as the “Copernican Revolution” marked a shift from the conception of a cosmos in which the Earth is the center to the conception of a system in which the Sun is the center.  A movement from geocentrism (and anthropocentrism) to heliocentrism.

The biological revolution took the shape of the theory of evolution (“It’s only a theory!” exclaim the unintelligent designers), which describes the adaptation of organisms to their environments through the process of non-random natural selection.

There is a third revolution, and it occurred in psychology.  I am not alluding to psychoanalysis, but rather to the revolution that predated psychoanalysis and made it possible (Freud was an admirer of Nietzsche).  Without the Nietzschean revolution, psychoanalysis would be unthinkable, and Twitter philosopher Sam Harris’s Free Will (2012) would never have existed.

I am alluding to the revolution that Nietzsche effected in 1878.  It was a silent revolution.  Almost no one seems aware that this revolution ever took place.

It is a revolution that describes the turning-away from voluntarism (the theory of free will) and the turning-toward determinism, and Nietzsche’s determinism will condition his critique of morality.  Nietzschean determinism is the doctrine of total irresponsibility and necessity.

[Let it be clear that I know that Spinoza, Hume, Hobbes, Schopenhauer, et al., wrote against the concept of the free will before Nietzsche.]

The free will is the idea that we have control over our own thoughts, moods, feelings, and actions.  It conceives of the mind as transparent to itself: We are aware in advance of why we do-say-write-think the things that we do-say-write-think.  This idea is false: You no more know what your next thought will be than you know what the next sentence of this commentary will be (if this is your first time reading this text).  It is only after the fact that we assign free will to the sources of actions, words, and thoughts.  Our thoughts, moods, and feelings—e.g. anger, desire, affection, envy—appear to us as isolated mental states, without reference to previous or subsequent thoughts, moods, and feelings: This is the origin of the misinterpretation of the human mind known as “the free will” (the definite article the even suggests that there is only one).  The free will is an illusion of which we would do well to disabuse ourselves.

We do not think our thoughts.  Our thoughts appear to us.  They come to the surfaces of our consciousness from the abysms of the unconscious mind.  Close your eyes, and focus on the surfacings and submersions of your own thoughts, and you will see what I mean.

This simple exercise of self-observation suffices to disprove the illusion of voluntarism.  If your mind is babbling, this very fact of consciousness refutes the idea of free will.  Mental babble invalidates the voluntarist hypothesis.  Does anyone truly believe that s/he wills babble into existence?  Does anyone deliberately choose the wrong word to say or the wrong action to perform?  If free will existed, infelicity would not exist at all or would exist less.  After all, what would free will be if not the thinking that maps out what one will have thought-done-said-written—before actually having thought one’s thought / done one’s deed / said one’s words / written one’s words?

Belief in free will provokes hatred, malice, guilt, regret, and the desire for vengeance.  After all, if someone chooses to behave in a hateful way, that person deserves to be hated.  Anyone who dispenses with the theory of the free will hates less and loves less.  No more desire for revenge, no more enmity.  No more guilt, no more regret.  No more rewards for impressive people who perform impressive acts, for rewarding implies that the rewarded could have acted differently than s/he did.  In a culture that accepted the doctrine of total irresponsibility, there would be neither heroes nor villains.  There would be no reason to heroize taxi drivers who return forgotten wallets and purses to their clients, nor would there be any reason to heroize oneself, since what a person does is not his choice / is not her choice.  No one would be praised, nor would anyone praise oneself.  No one would condemn others, nor would anyone condemn oneself.  Researchers would investigate the origins of human behavior, but would not punish, for the sources of all human thought and therefore the sources of all human behavior are beyond one’s conscious control / beyond the reach of consciousness.  It makes no sense to say / write that someone is “good” or “evil,” if goodness and evilness are not the products of a free will.  There is no absolute goodness or absolute evilness; nothing is good as such or evil as such.  There is neither voluntary goodness nor voluntary evilness.

If there is no free will, there is no human responsibility, either.  The second presupposes the first.  Do you call a monster “evil”?  A monster cannot be evil if it is not responsible for what it does.  Do we call earthquakes “evil”?  Do we call global warming “evil”?  Natural phenomena are exempt from morality, as are non-human animals.  We do not call natural phenomena “immoral”; we consider human beings “immoral” because we falsely assume the existence of a free will.  We feel guilt / regret for our “immoral” actions / thoughts, not because we are free, but because we falsely believe ourselves to be free: [W]eil sich der Mensch für frei hält, nicht aber weil er frei ist, empfindet er Reue und Gewissensbisse [MAM: 39].  No one chooses to have Asperger syndrome or Borderline Personality Disorder.  Why, then, should someone who is afflicted with Asperger syndrome or Borderline Personality Disorder be termed “evil”?  No one chooses one’s genetic constitution.  You are no more responsible for the emergence of your thoughts and your actions than you are responsible for your circulatory system or for the sensation of hunger.

Those who would like to adumbrate Nietzsche’s “mature” thought should begin with Human, All-Too-Human (1878), not with Daybreak (1881).  Nietzsche’s critique of morality makes no sense whatsoever without an understanding of his deeper critique of voluntarism (the doctrine of free will): Again, the ideas of Good and Evil only make sense on the assumption of the existence of free will.

Anyone who dispenses with the idea of free will endorses a shift from a system of punishment to a system of deterrence (Abschreckung).  A system of deterrence would restrain and contain criminals so that someone would not behave badly, not because someone has behaved badly.  As Nietzsche reminds us, every human act is a concrescence of forces from the past: one’s parents, one’s teachers, one’s environment, one’s genetic constitution.  It makes no sense, then, to believe that any individual is responsible for what he or she does.  All human activity is motivated by physiology and the unconscious mind, not by Good or Evil.  Everything is necessary, and it might even be possible to precalculate all human activity, through the mechanics of artificial intelligence, to steal a march on every advance: Alles ist notwendig, jede Bewegung mathematisch auszurechnen… Die Täuschung des Handelnden über sich, die Annahme des freien Willens, gehört mit hinein in diesen auszurechnenden Mechanismus [MAM: 106].

If you accept the cruelty of necessity (and is life not cruel, if we have no say in what we think and what we do?), the nobility of humanity falls away (the letter of nobility, the Adelsbrief) [MAM: 107].  All human distinction is devalued, since it is predetermined—since it is necessary.  Human beings would finally recognize themselves within nature, not outside of nature, as animals among other animals.  I must cite this passage in English translation, one which is not irrelevant to this context and one which belongs to the most powerful writing I have ever read, alongside Macbeth’s soliloquy upon learning of his wife’s death: “The ant in the forest perhaps imagines just as strongly that it is the goal and purpose for the existence of the forest as we do, when we in our imagination tie the downfall of humanity almost involuntarily to the downfall of the Earth: Indeed, we are still modest if we stop there and do not arrange a general twilight of the world and of the gods (eine allgemeine Welt- und Götterdämmerung) for the funeral rites of the final human (zur Leichenfeier des letzten Menschen).  The most dispassionate astronomer can scarcely feel the lifeless Earth in any other way than as the gleaming and floating gravesite of humanity” [WS: 14].

The demystification of the theory of free will has been re-presented by Sam Harris, who might seem like the Prophet of the Doctrine of Necessity.  Those who have never read Nietzsche might believe that Dr. Harris is the first person to say these things, since Dr. Harris never credits Nietzsche’s theory of total human irresponsibility.  If you visit Dr. Harris’s Web site, you will discover a few English translations of Nietzsche on his Recommended Reading List.  We know that Dr. Harris’s first book (unpublished) was a novel in which Nietzsche is a character.  We also know that Dr. Harris was a student of Philosophy at Stanford University.  He would therefore not have been unaware of the Nietzschean resonances in his own text Free Will.  Why, then, has Dr. Harris never publicly acknowledged his indebtedness to Nietzschean determinism?

Nietzsche Is / Is Not (Always) a Misogynist.

In 1882, Nietzsche was sexually rejected by Lou Andreas-Salomé, a Russian intellectual, writer, and eventual psychoanalyst who was found spellbinding by seemingly every cerebral man she met, including Rilke and Paul Rée.  Since the first edition of Human, All-Too-Human was published four years before, Salomé’s rejection of Nietzsche cannot be said to have had an impact on his reflections on women at that stage in the evolution of his thinking.

Nietzsche is sometimes a misogynist.  But I must emphasize: He is not always a misogynist.

At times, Nietzsche praises women / is a philogynist.  To give evidence of Nietzsche’s philogyny, all one needs to do is cite Paragraph 377 of the first volume: “The perfect woman is a higher type of human being than the perfect man” (Das vollkommene Weib ist ein höherer Typus des Menschen, als der vollkommene Mann).  Elsewhere, Nietzsche extols the intelligence of women: Women have the faculty of understanding (Verstand), he writes, whereas men have mind (Gemüth) and passion (Leidenschaft) [MAM: 411].  The loftier term Verstand points to the superiority of women over men.  Here, Nietzsche is far from misogynistic—indeed, he almost seems gynocratic.

Nor is Nietzsche a misogynist, despite appearances, in the following passage—one in which he claims that women tolerate thought-directions that are logically in contradiction with one another: Widersprüche in weiblichen Köpfen.—Weil die Weiber so viel mehr persönlich als sachlich sind, vertragen sich in ihrem Gedankenkreise Richtungen, die logisch mit einander in Widerspruch sind: sie pflegen sich eben für die Vertreter dieser Richtungen der Reihe nach zu begeistern und nehmen deren Systeme in Bausch und Bogen an; doch so, dass überall dort eine todte Stelle entsteht, wo eine neue Persönlichkeit später das Übergewicht bekommt [MAM: 419].

To paraphrase: Nietzsche is saying that the minds of women are fluxuous and not in any pejorative sense.  He means that multiple positions coexist simultaneously in the consciousnesses of women.  Personalities are formed and then evacuate themselves, leaving dead spots (todte Stellen), where new personalities are activated.  This does not mean that the minds of women contain “dead spots”—it means that they are able to form and reform new personalities, which is a strength, not a weakness.  And yet does he not say the same thing about his invisible friends, the free spirits?  Free spirits are also in a state of constant flux, and their fluxuousness, while necessarily unjust to their own opinions, allows them to move from opinion to opinion with alacrity and to hold in their heads multiple opinions at the same time.  Free spirits have opinions and arguments, but no convictions, for convictions are petrific.  Free spirits are guiltless betrayers of their own opinions [MAM: 637] and goalless wanderers from opinion to opinion [MAM: 638].

Why would the substitution of one position for another, intellectual inconstancy, be considered something negative?  Is not the ability to substitute a new position for an older one with alacrity a trait of the free spirit?  And is the free spirit not Nietzsche’s ideal human being—at least before the overhuman takes the stage?  Such is my main argument: Free-spiritedness is womanliness, and free spirits are womanly, if we accept Nietzsche’s definitions of “free-spiritedness” and of “womanliness.”

This is not to deny the strain of misogyny that runs throughout Nietzsche’s collected writings.  Yes, Nietzsche does write unkind and unjustifiable things about women—some of his statements about women are downright horrible and indefensible.  My objective here is to highlight the polysemy and polyvocality of his writing, its ambiguity.  For a further discussion of Nietzsche’s ambiguous representations of the feminine, consult Derrida’s Spurs, wherein he analyzes the figure of the veil in Beyond Good and Evil.

To say or write that Nietzsche is always a misogynist would be to disambiguate his work—if by “Nietzsche” one is referring to the paper Nietzsche.  (For a series of accounts of Nietzsche as a human being, see Conversations with Nietzsche: A Life in the Words of His Contemporaries, published by Oxford University Press.)  Nonetheless, let us pause over the historical, living human being Friedrich Nietzsche, who was male, and his relation to one historical, living human being, who was female: Marie Baumgartner, the mother of one of Nietzsche’s students and his sometime French translator.  In the original manuscript of Mixed Opinions and Maxims, the first appendix to Human, All-Too-Human, Nietzsche wrote: “Whether we have a serpent’s tooth or not is something that we do not know until someone has put his heel upon us.  Our character is determined even more by the lack of certain experiences than by what we have experienced” [VMS: 36].  In a letter to Nietzsche dated 13 November 1878, Marie Baumgartner wrote: “I would gladly have added to your very striking maxim: ‘a woman or mother would say, until someone puts his heel upon her darling or her child.’  For a woman will not silently allow something to happen to them that in most cases she patiently accepts for herself.”  Nietzsche was so affected by Baumgartner’s rather delicately worded suggestion that he modified the text to reflect her proposal.  If Nietzsche had regarded women as inferior (and he did not), why would he have taken seriously something that a female reader wrote about his manuscript—so seriously that he revised his manuscript to incorporate her words?  The fact that Nietzsche reflected Marie Baumgartner’s suggestion in the revision of his manuscript is evidence enough that he respected the intelligence of this particular woman—the grain of his own writing confirms that he respected the intelligence of women in general and even considered women in general to be more intelligent than men in general.

Nietzsche Was Not an Atheist, if by “Atheist” One Means “Someone Who Does Not Believe in God.”

Nietzsche tells us, in Paragraph Nine of the first volume, “Even if a metaphysical world did exist, it would be nothing other than an otherness [Anderssein] that would be unavailable and incomprehensible to us; it would be a thing with [purely] negative characteristics.”

My question (which has been inspired by Nietzsche) is the following: Why do we even care about the beyond?  Should questions such as “Is there life after death?” not be greeted with apathy?  Why are we engaged with such questions to begin with?  Do not such questions merit indifference rather than seriousness?

Questions such as “Does God exist?” and “Is there life after death?” cannot be answered scientifically or logically.  We do not require their answers in order to live.  All of us live out our lives without knowing the answers to such questions.  Not merely that: It is entirely possible to live out our lives without ever ASKING or PURSUING such questions—and would we not be better off for not having done so?

Let me put it another way: Do the questions “Why does the world exist?” and “Why is there being rather than nothing?” not presuppose a reason for existing and a reason for being?  I am looking at you, Heidegger.

The Nietzsche of 1878 is not an atheist, if by “atheist” one means “someone who does not believe in God.”  Those who contest the existence of a deity or deities are practicing a form of skiamachy.  Nietzsche, on the other hand, is someone who considers questions about the existence of God, or of any extra-worldly transcendence, to be superfluous.  Otherworldliness is not something that can be discussed, since it is purely negative.

Moreover, the Nietzsche of Human, All-Too-Human is not merely not an atheist.  He is also not a philosopher, if by “philosopher” we mean someone who speculates about imaginary worlds, an imaginary world-builder.  Nietzsche will not become a philosopher, speculative or otherwise, until the very end of his period of lucidity, with the doctrines of the Eternal Recurrence of the Always-Same and the Will to Power.

Nietzsche Contradicts Himself.  Often.  But This Is Not a Flaw in His Thinking.

Nietzsche contradicts himself—often—but this is not a flaw in his thinking.  He tells us to stop using the word “optimism” [MAM: 28] and then uses the word himself, without any perceptible irony, in other sections of the book.  After scolding us for believing in heroes, he warmly sponsors the “refined heroism” (verfeinerten Heroismus) of the free spirit who works in a small office and passes quietly into and out of life [MAM: 291].  In Paragraph 148 of the first volume, Nietzsche claims that the poet alleviates (erleichtert) life—this seems to contradict his claim, five paragraphs later, that “art aggravates the heart of the thinker” (Die Kunst macht dem Denker das Herz schwer), that listening to Beethoven’s Ninth Symphony infuses the listener with the heavy feeling of immortality, with religious and metaphysical conceptions.  If Nietzsche contradicts himself, and he does, this is because free-spiritedness is multitudinous, multi-perspectival, self-contradictory thinking.  Free-spiritedness is multi-spiritedness.

Aphorisms Inspired by Nietzsche

On Religion and Politics

What is religious is political, and what is political is religious.

On Morality

Morality depends on opportunity.

On Communication

A word means something different to you than it does to me, which means that communication is impossible: Nothing is communicable save the power to communicate the impossibility of communication.  (Nietzsche suggests that the worst alienation is when two people fail to understand each other’s irony.)  Consciousness of this fact would liberate us from the bitterness and intensity of every sensation.

On Interpretation

The mind is geared not toward what has been interpreted, but toward that which has not been interpreted and might not even be interpretable.  Nietzsche: “We take something that is unexplained and obscure to be more important than something that has been explained and made clear” [MAM: 532].

On the Voice

We often disagree with someone because of the sound of his or her voice.  We often agree with someone because of the sound of his or her voice.

On Salvation

In a 1966 interview with Der Spiegel, Heidegger claimed: “Only a god can save us.”  This statement must be revised: Not even a god could save us now.

On Censorial America

In contemporary America, you may be prosecuted and persecuted for what you think, insofar as what you think is available in language.

Joseph Suglia

A Critique of David Foster Wallace: Part Two: A Supposedly Fun Thing That I Will Never Do Again / “E Unibus Pluram: Television and U.S. Fiction” / “Getting Away from Already Being Pretty Much Away from It All” / “David Lynch Keeps His Head”

An Analysis of A SUPPOSEDLY FUN THING THAT I WILL NEVER DO AGAIN (David Foster Wallace) by Joseph Suglia

I have written it before, and I will write it again: Writing fictionally was not one of David Foster Wallace’s gifts.  His métier was, perhaps, mathematics.  It is possible that David Foster Wallace was a talented theorist of mathematics (I am unqualified to judge anyone’s talents in the field of mathematics), but he was an absolutely dreadful writer of ponderous fictions (I am qualified to judge anyone’s talents in the field of literature).

Wallace’s essay-aggregate A Supposedly Fun Thing that I Will Never Do Again (1997) is worth reading, if one is an undiscriminating reader, but it also contains a number of vexing difficulties that should be addressed.  I will focus here upon the two essays to which I was most attracted: “E Unibus Pluram: Television and U.S. Fiction” and “David Lynch Keeps His Head,” a conspectus on the director’s cinema from Eraserhead (1977) until Lost Highway (1997).  Wallace seems unaware of Lynch’s work before 1977.

In “E Unibus Pluram,” Wallace warmly defends the Glass Teat in the way that only an American can.  He sees very little wrong with television, other than the fact that it can become, in his words, a “malignant addiction,” which does not imply, as Wallace takes pains to remind us, that it is “evil” or “hypnotizing” (38).  Perish the thought!

Wallace exhorts American writers to watch television.  Not merely should those who write WATCH television, Wallace contends; they should ABSORB television.  Here is Wallace’s inaugural argument (I will attempt to imitate his prose):

1.) Writers of fiction are creepy oglers.

2.) Television allows creepy, ogling fiction-writers to spy on Americans and draw material from what they see.

3.) Americans who appear on television know that they are being seen, so this is scopophilia, but not voyeurism in the classical sense. [Apparently, one is spying on average Americans when one watches actors and actresses on American television.]

4.) For this reason, American writers can spy on other Americans without feeling uncomfortable and without feeling that what they’re doing is morally problematical.

Wallace: “If we want to know what American normality is – i.e. what Americans want to regard as normal – we can trust television… [W]riters can have faith in television” (22).

“Trust what is familiar!” in other words.  “Embrace what is in front of you!” to paraphrase.  Most contemporary American writers grew up in the lambent glow of the cathode-ray tube, and in their sentences the reader can hear the jangle and buzz of television.  David Foster Wallace was wrong.  No, writers should NOT trust television.  No, they should NOT have faith in the televisual eye, the eye that is seen but does not see.  The language of television has long since colonized the minds of contemporary American writers, which is likely why David Foster Wallace, Chuck Klosterman, and Jonathan Safran Foer cannot focus on a single point for more than a paragraph, why Thomas Pynchon’s clownish, jokey dialogue sounds as if it were culled from Gilligan’s Island, and why Don DeLillo’s portentous, pathos-glutted dialogue sounds as if it were siphoned from Dragnet.

There are scattershot arguments here, the most salient one being that postmodern fiction canalizes televisual waste.  That is my phrasing, not Wallace’s.  Wallace writes, simply and benevolently, that television and postmodern fiction “share roots” (65).  He appears to be suggesting that they both sprang up at exactly the same time.  They did not, of course.  One cannot accept Wallace’s argument without qualification.  To revise his thesis: Postmodern fiction–in particular, the writings of Leyner, DeLillo, Pynchon, Barth, Apple, Barthelme, and David Foster Wallace–is inconceivable outside of a relation to television.  But what would the ontogenesis of postmodern fiction matter, given that these fictions are anemic, execrably written, sickeningly smarmy, cloyingly self-conscious, and/or forgettable?

It did matter to Wallace, since he was a postmodernist fictionist.  Let me enlarge an earlier statement.  Wallace is suggesting (this is my interpretation of his words): “Embrace popular culture, or be embraced by popular culture!”  The first pose is that of a hipster; the second pose is that of the Deluded Consumer.  It would be otiose to claim that Wallace was not a hipster, when we are (mis)treated to so many hipsterisms, such as: “So then why do I get the in-joke? Because I, the viewer, outside the glass with the rest of the Audience, am IN on the in-joke” (32).  Or, in a paragraph in which he nods fraternally to the “campus hipsters” (76) who read him and read (past tense) Leyner: “We can resolve the problem [of being trapped in the televisual aura] by celebrating it.  Transcend feelings of mass-defined angst [sic] by genuflecting to them.  We can be reverently ironic” (Ibid.).  Again, he appears to be implying: “Embrace popular culture, or be embraced by popular culture!”  That is your false dilemma.  If you want others to think that you are special (every hipster’s secret desire), watch television with a REVERENT IRONY.  Wallace’s hipper-than-thou sanctimoniousness is smeared over every page.

Now let me turn to the Lynch essay, the strongest in the collection.  There are several insightful remarks here, particularly Wallace’s observation that Lynch’s cinema has a “clear relation” (197) to Abstract Expressionism and the cinema of German Expressionism.  There are some serious weaknesses and imprecisions, as well.

Wallace: “Except now for Richard Pryor, has there ever been even like ONE black person in a David Lynch movie? … I.e. why are Lynch’s movies all so white? … The likely answer is that Lynch’s movies are essentially apolitical” (189).

To write that there are no black people in Lynch’s gentrified neighborhood is to display one’s ignorance.  The truth is that at least one African-American appeared in the Lynchian universe before Lost Highway: Gregg Dandridge, who is very much an African-American, played Bobbie Ray Lemon in Wild at Heart (1990).  Did Wallace never see this film?  How could Wallace have forgotten the opening cataclysm, the cataclysmic opening of Wild at Heart?  Who could forget Sailor Ripley slamming Bobbie Ray Lemon’s head against a staircase railing and then against a floor until his head bursts, splattering like a splitting pomegranate?

To say that Lynch’s films are apolitical is to display one’s innocence.  No work of art is apolitical, because all art is political.  How could Wallace have missed Lynch’s heartlandish downhomeness?  How could he have failed to notice Lynch’s repulsed fascination with the muck and the slime, with the louche underworld that lies beneath the well-trimmed lawns that line Lynch’s suburban streets?  And how could he have failed to draw a political conclusion, a political inference, from this repulsed fascination, from this fascinated repulsion?

Let me commend these essays to the undiscriminating reader, as unconvincing as they are.  Everything collected here is nothing if not badly written, especially “Getting Away from Already Being Pretty Much Away from It All,” a hipsterish pamphlet about Midwestern state fairs that would not have existed were it not for David Byrne’s True Stories (1986), both the film and the book.  It is my hope that David Foster Wallace will someday be remembered as the talented mathematician he perhaps was and not as the brilliant fictioneer he certainly was not.

Joseph Suglia

EXTREMELY LOUD AND INCREDIBLY CLOSE by Jonathan Safran Foer

A Review of EXTREMELY LOUD AND INCREDIBLY CLOSE (Jonathan Safran Foer) by Joseph Suglia

The literature of September 11, 2001, is never attacked.  When a book speaks of September 11, 2001 (or of terrorism in general), it is more or less guaranteed immunity from criticism; it will, almost inescapably, be greeted with sympathy.

Jonathan Safran Foer’s Extremely Loud and Incredibly Close banks on such sympathy, on such reverence.  The narrative, which concerns a nine-year-old boy named Oskar Schell in search of a key that would unshell the enigma of his dead father (a narrative stolen, in its basic outline, from Günter Wilhelm Grass’s Die Blechtrommel), could have been written entirely without its scattered references to the terrorist interventions.  Nor is this trauma the only one presented in the novel: the others include Hiroshima and Nagasaki, the Staten Island Ferry crash, and the Dresden bombings.  Each disaster is generalized to the point at which what is addressed is not a traumatizing event in its specificity, but historical “trauma” itself and the overcoming of trauma through bereavement-inspired creation.

Oskar, the insufferable brat, attempts to complete the work of mourning for his father, Thomas Schell, Jr., a victim of September 11, by compiling an almanac of self-inflicted wounds, the collage of images and letters which is the book we are “reading”–an almanac which, most likely, is assembled sometime in the indefinite future.  (Thomas Schell, Sr.’s manuscript of 4/12/78 is heavily edited (208-216).  Who has done the editing?  Almost certainly an older version of his grandson Oskar.)  If the term “reading” even applies.  Whenever a “pregnant” image is described, Foer literally re-presents it in the form of a pictorial image.  When a flock of birds rises into the sky, it is not enough that we read of these birds — we must SEE them as well.  Words may not be left in their invisibility; we are presented with supplementary photographs, illustrations, since mere verbality is not enough.  Indeed, the entire novel oozes with misologos — the mistrust or hatred of language / reason — in relation to both its content and its form.  Photographs, yes, and also a superabundance of blank pages and nearly-blank pages.  Space is not used in the manner it is in the works of Edmond Jabès, for instance.

Typography does not substitute for a well-wrought sentence.  Foer abdicates all responsibilities–most specifically, the responsibility to write well.  Why bother when the pyrotechnics of typography are at his disposal?

As far as the writing is concerned, it is composed of nothing other than mind-numbingly, soul-deadeningly repetitive phrases (“heavy boots,” “raison d’être,” etc.) and Sunday school platitudes: “Sometimes one simply wants to disappear” (184); “There’s nothing wrong with not understanding yourself” (184); “Everything that’s born has to die, which means our lives are like skyscrapers” (245); “How can you say I love you to someone you love? … It’s always necessary” (314).  Whenever the author writes something that he finds “beautiful” and “true” (165), he congratulates himself on his beauty and truth and tells us that that thing is “beautiful” and “true.”  The entire book reeks of such unearned profundity.  We also learn that most dust is made up of human detritus–a very deep truth indeed, one that Foer also communicates in his essay “Emptiness” (originally published in Playboy) with all of the sanctimoniousness and self-righteousness of the faux naïf who serves as the center of the novel.  That essay is a Sunday-School lecture in which we learn that famous musicians (Ringo Starr) and scientists (Stephen Hawking) are unthreateningly approachable: Everything is familiarized.

Perhaps it is wrong to criticize Foer for including so many blank pages in Extremely Loud and Incredibly Close, since the entire book is a vacuum: null space into which readers may project their own meanings.

Dr. Joseph Suglia

THE HISTORY OF LOVE by Nicole Krauss | A Negative Review | Nicole Krauss, Natalie Portman, and Jonathan Safran Foer Emails

An Analysis of THE HISTORY OF LOVE (Nicole Krauss)

by Dr. Joseph Suglia

Nicole Krauss’s The History of Love (2005) would have been better titled The History of Stupidity.  Much in the way that her contemporaries and congeners approach their readerships, Krauss approaches her readership with contempt (i.e., with a set of low expectations).  Most Americans, after all, are gum-chewing television-watchers who have never picked up a book in their lives.  I certainly do not believe this tiresome cliché, but the American publishing industry does.  And so does Nicole Krauss.

Krauss panders.  She explains everything to the reader.  In the end, the reader feels insulted for being treated with such contempt.  I am not fooled by the novel’s pretensions to experimentalism (this is NOT a formally challenging novel).  Yes, we are presented with three interlocking narratives: one written by an old man, another written by the woman he loves, and a third by a fourteen-year-old girl.  But the plot is hideously simplistic: An old man writes a book inspired by his inamorata, Alma.  The book gets away from him.  Alma reads the book.  Fin.

Krauss has mastered the marketing strategies of her erstwhile husband, the celebrity-obsessed Jonathan Safran Foer, who also uses the interlocking narrative structure, a superabundance of nearly-blank pages, and narrators who are functionally illiterate.  In the end, The History of Stupidity feels as if it were a self-advertisement–not so much an advertisement for the author as an advertisement for itself.  As with SUV commercials, the target audience here is painfully clear: Typical Dumb Americans who find sweet old men and little girls stupidly charming.  Again, it is not I who believe this tiresome cliché.  It is Nicole Krauss.

Not merely is the novel infantile from a formal perspective; the content is also similarly stunted.

Particularly stunning are Krauss’s scatological obsessions.  I am not suggesting that authors should not take scatology as their subject (Roland Topor created a masterly play on coprophilia entitled Leonardo Was Right), nor am I attacking the book on some pseudo-moralistic, Medvedian ground.  H. G. Wells assailed James Joyce (whose name is showcased, pointlessly, twice in this novel) for the latter’s so-called “cloacal obsession.”  But though there is scatology in Joyce, it serves a “transcendent” purpose.  In Krauss, however, the references to the excremental point to nothing other than themselves.  Nothing is more infantile than gastrointestinal humor.

And so we have Leo Gursky struggling with a bowel movement on Page Fifteen, “Zvi Litvinoff” defecating on Page Sixty-Nine, and a tzaddik in an outhouse engaging in one of the “coarse miracles of life” on Page 127.  I could go on, but I don’t want to.  Nicole Krauss seems fascinated by excrementality, which is appropriate, since her book is a steaming mound of yellow horse-dung.

One last thing: If Leo Gursky has written such an important book, why are all of the passages cited halting and puerile?

What we are witnessing is the “dumbing-down” of literary fiction.  We need a new constructivism (I do not use this word in its traditional sense), after three decades of infantilism in American letters.

Joseph Suglia

TREE OF CODES by Jonathan Safran Foer / WRITING WITH SCISSORS – by Dr. Joseph Suglia

WRITING WITH SCISSORS: A review of TREE OF CODES (Jonathan Safran Foer)

by Dr. Joseph Suglia

“[Schulz’s] writing is so unbelievably good, so much better than anything that could conceivably be done with it [?], that more often than not I simply wanted to leave it alone.”
–Jonathan Safran Foer

He should have left it alone.

What does one do if one wishes to become a writer but lacks verbal talent?  If one is Jonathan Safran Foer, one mutes and mutilates magical masterpieces.  Tree of Codes (2010) is an anti-book, assaulting language, crushing words under the weight of optical imagery, a non-book in which words serve a merely ornamental function.  It is an atomic weapon that is pitted against verbality, against writing, against the Word.  It is the stifling of a book, a sequence of stillnesses.  There is more writing–more expressive language–in Max Ernst’s collage novels.

To construct this monstrosity, Foer took an English translation of Bruno Schulz’s magisterial Sklepy Cynamonowe (“Cinnamon Shops,” 1934).  (Please note: The book is NOT called “The Street of Crocodiles,” no matter what Foer might tell you.)  Foer then carved blocks of text out of the English translation, excising Schulz’s beautiful prose poetry, scissoring it up.  Anyone who finds this practice innovative should consult the work of Tristan Tzara, Brion Gysin, and Raymond Queneau.

Here are two of Foer’s vicious eviscerations:

“The demands were made more loudly, we heard him talk to God, as if begging against insistent claims” (28).

“Knot by knot he loosened himself, as unremarked as the grey heap swept into a corner waiting to be taken” (39).

SPAM poetry.

Refrigerator-magnet poetry.

The first problem with Foer’s cut-up is that he chooses the wrong object.  Knock, knock!  Schulz wrote in Polish, not in English.  What on Earth is the point of cutting up, mucking up, mashing up, and rescrambling the English translation of a Polish novel?  Polish is frightfully difficult to render into English.  If you would like evidence for this assertion, take a look at any English translation of Jan Potocki, Bruno Schulz, Stanislaw Ignacy Witkiewicz, or Witold Gombrowicz.  Consider, for instance, Alastair Hamilton’s translation of Gombrowicz’s Pornografia.  Hamilton translated a French translation of the novel into English: His is the translation of a translation.

Secondly: The ingenious Bruno Schulz–a writer more gifted than Kafka, in my estimation–did not have to dazzle his readers with glistening typographies.  He let language do the work.  He let his beautiful prose speak for itself.  If Schulz’s book is the richest book Foer ever read (it is one of the richest books I’ve yet read), why disembowel all of that richness?  We know the answer: Because Foer feels condemned by the richness, threatened by the richness, punished by the richness.  Foer the Hipster, who is incapable of expressing himself inventively in writing, chainsawed the work of a great author, an author who intimidated him.  Foer’s venomous envy and hatred of Schulz are unmistakable.

Snip, snip, snip!  Pare it down!  Tear it up!  What we are left with is an absolute abomination, something far worse than a book burning.  It is one thing to immolate a great book such as “Cinnamon Shops.”  It is quite another to replace a great book with a papier-mâché dummy, an Ersatz effigy, a kitschy replica.  Nothing more malicious in the literary arts could be imagined.

In the republic of letters, Jonathan Safran Foer will be remembered as a slicer, shearer, and shredder of literature.  He is at home in a culture that is tawdry, boring, and stupid and that is becoming tawdrier, more boring, and more stupid by the day.

Dr. Joseph Suglia

Postscript: STREET OF CROCODILES = *TREE* OF C*O**D**ES

*

EATING ANIMALS by Jonathan Safran Foer / Is Jonathan Safran Foer a Bad Writer?

A review of EATING ANIMALS by Jonathan Safran Foer

Mr. and Mrs. Jonathan Safran Foer have made a living by choosing illiterates and children as the narrators of their commercial fiction.  Such a writerly choice relieves them of the responsibility of writing well.  Now, in his most recent offering, Eating Animals (2009), Mr. Foer writes in his own language for the first time in book-form and still sounds very much like the rather dimwitted narrators of his novelistic fabrications.

Though it never fulfills its promise, Eating Animals belongs to the genre of books that explore the ethics of meat-eating.  Foer claims that his research into food-production has been “enormous” [14] and “comprehensive” [12].  But from a philological point of view, Eating Animals is the scholarly equivalent to animal-compost.  How can the male Foer legitimately write and publish a book on the ethics of carnivory without so much as even mentioning the names of Peter Singer and Charles Patterson?  A peal of thundering silence drowns out these extremely loud and incredibly imposing references.  On Page 258, Foer eschews direct statement, but the point is clear: “It might sound naive to suggest that whether you order a chicken patty or a veggie burger is a profoundly important decision.  Then again, it certainly would have sounded fantastic if in the 1950s you were told that where you sat in a restaurant or on a bus could begin to uproot racism.”  Yes, human-rights are equated to animal-rights, EXACTLY the equation set forward by Peter Singer thirty-four years ago.  It does seem parricidal that no reference to Singer or to Patterson is made.

Even worse, Foer’s handling of sources is suspect.  He name-drops Walter Benjamin, tells us what Benjamin allegedly said, and then neglects to give us the citation-information in the endnotes (he is alluding to, but does not cite, Benjamin’s 1934 essay on Franz Kafka).  He implies that Kafka felt “shame” while visiting a Berlin aquarium merely because Benjamin finds shame as a motif in Kafka’s LITERARY work.  He quotes Derrida twice in the book: he first gives an inapplicable commentary on Derrida’s argument and then dispenses with commentary altogether.  In his end note to the Benjamin-Kafka-Derrida passage, Foer writes: “The discussion of Benjamin, Derrida, and Kafka in this section is indebted to conversations with religion professor and critical theorist Aaron Gross” [276].  This discussion, apparently, exonerates Foer of the necessity of reading Benjamin, Derrida, and Kafka himself–and of treating their works with care.

I would never dream of suggesting that Foer should have expatiated on the groundbreaking inclusion of animality in Schopenhauerian philosophy and the exclusion of animality from the Kantian philosophy–that would be effrontery on my part.

The prose-style is not merely bad–it is abusively, appallingly, annoyingly, and aggressively bad.  Foer thinks that to aggravate means “to irritate,” that incredibly means “extremely,” that the plural of food is “foods,” and that inedible is a noun.  To aggravate [etymologically, “to make graver”] should never be used to signify “to irritate” in published prose; incredibly properly means “unbelievably” and only means “extremely” in colloquial language; those who think that the plural of food can EVER be “foods” are semiliterate simpletons and debasers of the English language.  Shall we acquiesce to the mistaken idea that inedible is a noun?  (Edible may be a noun; inedible should never be a noun.)

Is it too much to ask the writer whose second novel was described by The Times as a “work of genius” to pursue his research-questions?  And what ARE, precisely, his research-questions?  After an unhealthful serving of microwaved family-anecdotes (always an easy and smarmy introduction), we get an inkling of what Foer’s point of departure might be, and it is all pretty familiar ground: “I simply wanted to know–for myself and my family–what meat is.  I wanted to know as concretely as possible.  Where does it come from?  How is it produced?  How are animals treated, and to what extent does that matter? What are the economic, social, and environmental effects of eating animals?” [12].  Well, what we get instead are heaps of digitalized information copied and pasted from the internet and fictionalized first-person narratives written from the perspective of animal-rights activists and factory-farmers, the kind of “I-am-my-own-Greek-chorus” meta-fiction one often encounters when teaching first-year Composition at an art-school.  Excise the persona-poetry, and you have a pamphlet.

It is only at the book’s premature climax that we come by something resembling a thesis.  Foer endorses “eating with care.”  Despite what he says, Foer does not “argue” for this position.  Nor does he even explain it.  He simply advocates what seems a fairly anodyne stance.  He advocates vegetarianism and “another, wiser animal agriculture” and “more honorable omnivory” [244], without telling us what either of these last-mentioned things might be.  Don’t carnify your comestibles!  That is the extent of the “argument,” such as it is.

There is nothing revolutionary or special about vegetarianism or hoping that animals will be treated without cruelty.  Vegetarianism is surely good for animals, but does it make of the vegetarian a majestic figure?  If this book is distinctive at all, it is merely because of the prefabricated consensus that surrounds it and the writer’s desperate efforts to persuade everyone that he is holier than the rest of us.  One is reminded, in particular, of an anecdote that Foer tells of two friends who are hungry for hamburgers or for “burgers,” as Foer calls them.  One man gives in to the hamburger-impulse; the other refuses to do so, for “there are things more important to him than what he is in the mood for at any given moment” [74; note the masculine pronoun].  In the end, Eating Animals is an auto-hagiography, the memoir of a sacrificer of hamburgers who becomes holy by refusing to give in to his carnivoracity, the story of one man’s relationship to his own viscera.

Dr. Joseph Suglia


EVERYTHING IS ILLUMINATED by Jonathan Safran Foer

Though I have no idea what he looks like, on paper, Jonathan Safran Foer is a dumpy magician garbed in a tattered black cape with a red velvet underside, waving his hands wildly, brandishing a cane purchased at Woolworth’s, a shabby magician’s hat propped on his balloon-shaped head, forever mugging and attention-grubbing, radiating spittle and a desperate need to be liked, nasalizing the same stale jokes ad infinitum, while the audience laughs wanly and with painful politesse.  His overeager face comes too close to yours, his tongue impending over his lower lip, which is bespattered with saliva.

Consider Foer’s massively popular Everything is Illuminated (2002).  While it is not the worst book that I have ever read, it is easily the smarmiest.  Nearly every page is dripping with dollops of cynically contrived pap, mawkish kitsch that appeals to the child in all of us.  You know, that child who is beguiled easily and who doesn’t know the difference between art and tripe.

The novel is structured according to two temporal continua.  The first continuum is narrated from the perspective of Alexander Perchov, The Loveable Ukrainian Tour Guide of one “Jonathan Safran Foer” (also known in the text as “the hero” and “the ingenious Jew”).  “Foer” is searching for the woman who saved his grandfather from death at the hands of the Nazis.  To create Alex’s language, the writer takes ordinary sentences in English and substitutes certain infelicitous words for more felicitous ones.  This gimmick grows tedious after the first three pages, and nothing, of course, is more uncouth than an American writer who mocks the speech patterns of those who speak English as a foreign tongue.  Alex’s malapropisms, however, are more pleasant to read than “Foer’s” prose in the second continuum, a turgidly narrated history of Trachimbrod, a Ukrainian shtetl, from its foundation in the late eighteenth century until its destruction during the Second World War.

Both continua are interlaced–as the first continuum culminates in the discovery of Trachimbrod by “Foer” and his tour guide, the second culminates in an account of the mass murder of its inhabitants; the fatality of Alexander’s grandfather is superimposed on the fatality of “Foer’s” grandfather, and so forth.  The point, plangently, is that “everything” in the present is “illuminated” by the past.  The alleged “cleverness” of this narrative device escapes this reviewer.

Every one hundred pages or so, a striking passage or sentence emerges from the thick, grey, monotonous mass that surrounds it, a passage or sentence that seems, at first glance, almost profound.  Upon further examination, however, these profundities reveal themselves to be specious banalities.

Let me allude to two examples of Profound Truths in Everything is Illuminated:

“God loves the plagiarist…  God is the original plagiarizer… the creation of man was an act of reflexive plagiarizing; God looted the mirror” [Olive Edition, 185].

In other words, if you paint a portrait of yourself, you are “plagiarizing” yourself.  If you photograph yourself in a mirror, you are “plagiarizing” yourself.  To say that the creation of man was an act of plagiarism is to void the word “plagiarism” of all meaning.  There is, nonetheless, genuine theft in Everything is Illuminated: Foer does God’s work by pilfering the entire final section of David Grossman’s See Under: Love, “The Complete Encyclopedia of Kazik’s Life.”  Foer isn’t so much influenced by Grossman as he is dominated by him.

Another “profound” moment:

“The only thing more painful than being an active forgetter is to be an inert rememberer” [360].

Foer here forgets that active forgetting (a term taken from Nietzsche, aktive Vergesslichkeit) is the same thing as inert remembrance.

Friedrich Schlegel once said of Denis Diderot: Whenever he does something truly brilliant, he congratulates himself on his brilliance.  In my essay on Even Cowgirls Get the Blues, I write the same thing about Tom Robbins.  The term brilliant must be supplanted in the case of Jonathan Safran Foer, however: Whenever he writes something sentimental, Foer congratulates himself on his easy sentimentalism.  It is difficult to sell a crowd-pleasing novel about the Shoah unless everything is sentimentalized.

Dr. Joseph Suglia
