A Critique of BLINK by Malcolm Gladwell


An Analysis of BLINK (Malcolm Gladwell) by Joseph Suglia

Malcolm Gladwell’s Blink (2005) is not a meticulously researched book.  Nearly all of its ‘research’ was derived from studies in The Journal of Personality and Social Psychology.  In the book’s Notes (a mere seven pages in length), you will count fifteen references to that journal and a few references to other sources.

It seems appropriate that Gladwell’s research is so slipshod.  After all, Blink is a war-machine pitted against research in all of its forms.  On Gladwell’s account, there simply isn’t time to investigate and deliberate.  And the more you research, the less you will know.

The more you think, the less you will know.

Blink celebrates and affirms pre-knowledge, the uncritical reflex, the snap judgment, the spur-of-the-moment decision.

Our initial perception of things is always correct, according to Gladwell, unless our minds are led astray by some extraneous matter.  All of us would come to the same conclusions, provided that we refined our “thin-slicing” skills.  “To thin-slice,” in this context, means to extract the salient meaning from an initial impression.  All of us are afforded an immediate and direct insight into the atemporal essences of things.

All of this is ‘argued’ anecdotally.  As I mentioned in the opening of this review, nearly all of the anecdotes were stolen from a single collective source and, in many cases, misappropriated.  Gladwell tells us that students can instantly judge a teacher’s effectiveness as soon as s/he walks into the classroom.  What Gladwell doesn’t tell us is that the article from which he derived this ‘truth’ concerns the impact of a teacher’s perceived sex-appeal on course-evaluations.

How the ‘glimpse’ actually works is never explained; we are told, in several places, that instantaneous intuition “bubbles up” unbidden from the recesses of the “adaptive unconscious.”  “The” adaptive unconscious, mind you, as if there could only be one.  This is, of course, monism, and Gladwell believes in absolutes.

Of course, one’s initial impressions might yield profitable results.  But to say that one’s immediate intuition of the world is inherently superior to slow and careful thinking is madness.  One should beware of any form of mysticism, and Gladwell’s blank intuitionism could easily be put in the service of a fascistic Wille zur Macht (will to power).

Blink’s target audience is composed of Hollywood producers, literary agents, advertisers, and military strategists.  You will learn in this book that films that feature Tom Hanks are superior to those that do not, that margarine tastes better when packaged in foil, that music sounds better when marketed the right way to the right people, and that military strikes should be carried out without discipline or forethought.  The surface-impression is everything.  Submit to your impulses!

Blink is American pop-culture’s defense of its own stupidity.

Joseph Suglia


Three Aperçus: On DEADPOOL (2016), David Foster Wallace, and Beauty


by Joseph Suglia

Deadpool (2016) is capitalism with a smirking face.

David Foster Wallace was not even a bad writer.

Beauty is the one sin that the Average American of today cannot forgive.

*


Two Aperçus: THE NEON DEMON (2016)


The Neon Demon (2016) is a snuff film in which art is murdered.

Descent (2007) is superior to The Neon Demon because the former has an Aristotelian structure–which works.

Joseph Suglia


David Foster Wallace and Macaulay Culkin: Two aperçus


David Foster Wallace was a sudorific pseudo-author.

Macaulay Culkin has only one thing in common with the young Lou Reed: a heroin addiction.

Joseph Suglia


A Critique of David Foster Wallace: Part Two: A Supposedly Fun Thing That I Will Never Do Again / “E Unibus Pluram: Television and U.S. Fiction” / “Getting Away from Already Being Pretty Much Away from It All” / “David Lynch Keeps His Head”


An Analysis of A SUPPOSEDLY FUN THING THAT I WILL NEVER DO AGAIN (David Foster Wallace) by Joseph Suglia

I have written it before, and I will write it again: Writing fictionally was not one of David Foster Wallace’s gifts.  His métier was, perhaps, mathematics.  It is possible that David Foster Wallace was a talented theorist of mathematics (I am unqualified to judge one’s talents in the field of mathematics), but he was an absolutely dreadful writer of ponderous fictions (I am qualified to judge one’s talents in the field of literature).

Wallace’s essay-aggregate A Supposedly Fun Thing that I Will Never Do Again (1997) is worth reading, if one is an undiscriminating reader, but it also contains a number of vexing difficulties that should be addressed.  I will focus here upon the two essays to which I was most attracted: “E Unibus Pluram: Television and U.S. Fiction” and “David Lynch Keeps His Head,” a conspectus on the director’s cinema from Eraserhead (1977) to Lost Highway (1997).  Wallace seems unaware of Lynch’s work before 1977.

In “E Unibus Pluram,” Wallace warmly defends the Glass Teat in the way that only an American can.  He sees very little wrong with television, other than the fact that it can become, in his words, a “malignant addiction,” which does not imply, as Wallace takes pains to remind us, that it is “evil” or “hypnotizing” (38).  Perish the thought!

Wallace exhorts American writers to watch television.  Not merely should those who write WATCH television, Wallace contends; they should ABSORB television.  Here is Wallace’s inaugural argument (I will attempt to imitate his prose):

1.) Writers of fiction are creepy oglers.

2.) Television allows creepy, ogling fiction-writers to spy on Americans and draw material from what they see.

3.) Americans who appear on television know that they are being seen, so this is scopophilia, but not voyeurism in the classical sense. [Apparently, one is spying on average Americans when one watches actors and actresses on American television.]

4.) For this reason, American writers can spy on other Americans without feeling uncomfortable and without feeling that what they’re doing is morally problematical.

Wallace: “If we want to know what American normality is – i.e. what Americans want to regard as normal – we can trust television… [W]riters can have faith in television” (22).

“Trust what is familiar!” in other words.  “Embrace what is in front of you!” to paraphrase.  Most contemporary American writers grew up in the lambent glow of the cathode-ray tube, and in their sentences the reader can hear the jangle and buzz of television.  David Foster Wallace was wrong.  No, writers should NOT trust television.  No, they should NOT have faith in the televisual eye, the eye that is seen but does not see.  The language of television has long since colonized the minds of contemporary American writers, which is likely why David Foster Wallace, Chuck Klosterman, and Jonathan Safran Foer cannot focus on a single point for more than a paragraph, why Thomas Pynchon’s clownish, jokey dialogue sounds as if it were culled from Gilligan’s Island, and why Don DeLillo’s portentous, pathos-glutted dialogue sounds as if it were siphoned from Dragnet.

There are scattershot arguments here, the most salient one being that postmodern fiction canalizes televisual waste.  That is my phrasing, not Wallace’s.  Wallace writes, simply and benevolently, that television and postmodern fiction “share roots” (65).  He appears to be suggesting that they both sprang up at exactly the same time.  They did not, of course.  One cannot accept Wallace’s argument without qualification.  To revise his thesis: Postmodern fiction–in particular, the writings of Leyner, DeLillo, Pynchon, Barth, Apple, Barthelme, and David Foster Wallace–is inconceivable outside of a relation to television.  But what would the ontogenesis of postmodern fiction matter, given that these fictions are anemic, execrably written, sickeningly smarmy, cloyingly self-conscious, and/or forgettable?

It did matter to Wallace, since he was a postmodernist fictionist.  Let me enlarge an earlier statement.  Wallace is suggesting (this is my interpretation of his words): “Embrace popular culture, or be embraced by popular culture!”  The first pose is that of a hipster; the second pose is that of the Deluded Consumer.  It would be otiose to claim that Wallace was not a hipster, when we are (mis)treated to so many hipsterisms, such as: “So then why do I get the in-joke? Because I, the viewer, outside the glass with the rest of the Audience, am IN on the in-joke” (32).  Or, in a paragraph in which he nods fraternally to the “campus hipsters” (76) who read him and read (past tense) Leyner: “We can resolve the problem [of being trapped in the televisual aura] by celebrating it.  Transcend feelings of mass-defined angst [sic] by genuflecting to them.  We can be reverently ironic” (Ibid.).  Again, he appears to be implying: “Embrace popular culture, or be embraced by popular culture!”  That is your false dilemma.  If you want others to think that you are special (every hipster’s secret desire), watch television with a REVERENT IRONY.  Wallace’s hipper-than-thou sanctimoniousness is smeared over every page.

Now let me turn to the Lynch essay, the strongest in the collection.  There are several insightful remarks here, particularly Wallace’s observation that Lynch’s cinema has a “clear relation” (197) to Abstract Expressionism and the cinema of German Expressionism.  There are some serious weaknesses and imprecisions, as well.

Wallace: “Except now for Richard Pryor, has there ever been even like ONE black person in a David Lynch movie? … I.e. why are Lynch’s movies all so white? … The likely answer is that Lynch’s movies are essentially apolitical” (189).

To write that there are no black people in Lynch’s gentrified neighborhood is to display one’s ignorance.  The truth is that at least one African-American appeared in the Lynchian universe before Lost Highway: Gregg Dandridge, who is very much an African-American, played Bobbie Ray Lemon in Wild at Heart (1990).  Did Wallace never see this film?  How could Wallace have forgotten the opening cataclysm, the cataclysmic opening of Wild at Heart?  Who could forget Sailor Ripley slamming Bobbie Ray Lemon’s head against a staircase railing and then against a floor until his head bursts, splattering like a splitting pomegranate?

To say that Lynch’s films are apolitical is to display one’s innocence.  No work of art is apolitical, because all art is political.  How could Wallace have missed Lynch’s heartlandish downhomeness?  How could he have failed to notice Lynch’s repulsed fascination with the muck and the slime, with the louche underworld that lies beneath the well-trimmed lawns that line Lynch’s suburban streets?  And how could he have failed to draw a political conclusion, a political inference, from this repulsed fascination, from this fascinated repulsion?

Let me commend these essays to the undiscriminating reader, as unconvincing as they are.  Everything collected here is nothing if not badly written, especially “Getting Away from Already Being Pretty Much Away from It All,” a hipsterish pamphlet about Midwestern state fairs that would not have existed were it not for David Byrne’s True Stories (1986), both the film and the book.  It is my hope that David Foster Wallace will someday be remembered as the talented mathematician he perhaps was and not as the brilliant fictioneer he certainly was not.

Joseph Suglia


A Critique / Refutation of OUTLIERS by Malcolm Gladwell


A Critique of OUTLIERS (Malcolm Gladwell) by Dr. Joseph Suglia

According to Nietzsche, Kant writes what the common man believes in a language that the common man cannot understand.  Malcolm Gladwell, it must be said, vigorously reaffirms what the common man believes in a language that the common man CAN understand, thus flattering the common man and “making him happy.”  “To be made happy”: a Gladwellism for “to be satisfied with a consumer item, such as a book by Malcolm Gladwell.”

In Outliers (2008), Gladwell argues, in essence: “It is better to be mediocre than it is to be brilliant!”  Perhaps that is too blunt a truncation, but the book seems to welcome such simplicity.

We are introduced to Chris Langan, “the public face of genius in American life” [70], who nonetheless works in construction and “despairs of ever getting published in a scholarly journal” [95].  Langan fails because he was raised in abject squalor, and his mother “missed a deadline for his financial aid” [98].  By contrast, Robert Oppenheimer, a “success” for his complicity in the atomization of Hiroshima and Nagasaki, was “raised in one of the wealthiest neighborhoods in Manhattan” [108].  Other actors within the community-theater proscenium include Marita, a twelve-year-old from an impoverished family who gives up her evenings, weekends, and friends to slave away in one of New York City’s most rigorous and competitive middle schools.  She will succeed, Gladwell suggests, because she “works hard” and is given a “chance.”  Indeed, Bill Gates was a “success” because he was given unlimited access to a time-sharing terminal at the age of thirteen.  The Beatles were a “success” because they forced themselves to perform eight-hour concerts in Hamburg between 1960 and 1962.  Along the way, the reader is pepper-sprayed with anecdotes about Korean aviation and Kentuckian aggression that have no apparent relevance to the thesis of the book, except to “demonstrate” that one’s “cultural legacy” sometimes has to be jettisoned in order for one to become “successful.”

Gladwell is arguing, in nuce, that success–euphemistic for “financial prosperity”–corresponds not to one’s intelligence, but rather to opportunity and social savoir-faire.  The thesis isn’t so much false as it is banal.  Of course, one must have social skills and opportunity to be “successful.”  And yet I would contend, pace Gladwell, that even social skills and opportunity are not enough, by themselves, for an individual to succeed financially.  Life never brooks such easy recipes (or follows such “predictable courses” [267], to use Gladwell’s language).

What, precisely, does Gladwell mean by “intelligence”?  The author hypostatizes the Intelligence Quotient Test and thus subscribes to the false supposition that intelligence can be quantified and measured.  If you receive 180 on the Intelligence Quotient Test, in other words, then you are a super-genius.  Now, I did score [number redacted] on the I. Q. Test, but that, in itself, is no guarantor of my genius.  Intelligence is an impalpable thing, and there is no necessary relationship whatsoever between one’s intelligence and the I. Q. examination, just as, following Gladwell, there is no necessary relationship between one’s I. Q. score and “success.”

Moreover, Gladwell ignores the temporal differences that separate his stories.  Oppenheimer lived in an America that was less intimidated by, and envious of, intelligence than the America of the twenty-first century.  I differ from Gladwell, and my counter-thesis is the following: Even if Langan possessed superior social skills, it is very likely that he still would have failed in life.

Why?  Because the culture has become a home for Swiftian Lilliputians, ever-ready to manacle down any Gulliver who comes their way.  Yes, Gladwell is correct in suggesting that geniuses almost always fail and the mediocre almost always triumph, but he completely misses the reasons.  You cannot possibly succeed if you are a genius unless you camouflage, to a certain extent, your intelligence.  We are living in a culture that, instead of lionizing intelligence, disdains it.  Those who possess a higher intellect than the multitude are looked upon with acrimony and mistrust.  Such is the “leveling-off” or equalization of all distinction to which polymaths and geniuses have long since grown accustomed.

Similarly, there is the impulse in this book to anathematize genius, as if genius were some kind of cancerous polyp that should be excised.  It is not difficult to detect a certain defensiveness in Gladwell’s anti-intellectualist posturing, not merely as if the myth that genius equals success needed to be debunked, but as if genius, in itself, were something intrinsically negative, threatening–damaging, even.  Gladwell, non-genius, is content to attack genius in Outliers with the same vehemence with which he attacked critical thinking in Blink.  And for exactly the same affective reason: Gladwell is as intimidated by genius as he is cowed by critical thought, for which he substitutes anecdotes lifted, quite uncritically, from single sources: books by John Ed Pearce, Richard E. Nisbett and Dov Cohen, Kai Bird and Martin J. Sherwin…

Gladwell’s most ardent admirers–non-brilliant readers who want reassurance that their non-brilliance is a formula for success–sigh plaintively and bleat.  And the mediocre shall inherit the Earth.

Concluding Unscientific Postscript.  In the early sixteenth century, Niccolò Machiavelli argued that the expansion of power comes from opportunity.  But he qualified the claim: from opportunity and through cleverness (virtù in Italian).

Joseph Suglia
