Thomas Wolfe Essays

Not to be confused with Thomas Wolfe or Tom Wolf.

Tom Wolfe

Wolfe at the White House in 2004

Born: Thomas Kennerly Wolfe Jr., March 2, 1931 (age 87), Richmond, Virginia, U.S.
Occupation: Journalist, author
Language: English
Nationality: American
Period: 1959–present
Literary movement: New Journalism
Notable works: The Painted Word, The Electric Kool-Aid Acid Test, The Right Stuff, A Man in Full, Radical Chic & Mau-Mauing the Flak Catchers, The Bonfire of the Vanities, I Am Charlotte Simmons, Back to Blood
Spouse: Sheila Wolfe
Children: 2

Thomas Kennerly Wolfe Jr. (born March 2, 1931)[1] is an American author and journalist, best known for his association with and influence in stimulating the New Journalism, a literary movement in which literary techniques are used extensively and the traditional ideal of journalistic objectivity is de-emphasized.

He began his career as a regional newspaper reporter in the 1950s, but achieved national prominence in the 1960s following the publication of such best-selling books as The Electric Kool-Aid Acid Test (a highly experimental account of Ken Kesey and the Merry Pranksters), and two collections of articles and essays, Radical Chic & Mau-Mauing the Flak Catchers and The Kandy-Kolored Tangerine-Flake Streamline Baby.

His first novel, The Bonfire of the Vanities, published in 1987, was met with critical acclaim, and also became a commercial success. It was adapted as a major motion picture of the same name, directed by Brian De Palma.

Early life and education

Wolfe was born in Richmond, Virginia, the son of Louise (née Agnew), a landscape designer, and Thomas Kennerly Wolfe Sr., an agronomist.[2][3]

Wolfe grew up on Gloucester Road in the historic Richmond North Side neighborhood of Sherwood Park. He recounts childhood memories in a foreword to a book about the nearby historic Ginter Park neighborhood.

Wolfe was student council president, editor of the school newspaper, and a star baseball player at St. Christopher's School, an Episcopal all-boys school in Richmond.

Upon graduation in 1947, he turned down admission to Princeton University to attend Washington and Lee University, both all-male schools at the time. At Washington and Lee, Wolfe was a member of the Phi Kappa Sigma fraternity. Wolfe majored in English and practiced his writing outside the classroom as well. He was the sports editor of the college newspaper and helped found a literary magazine, Shenandoah. Of particular influence was his professor Marshall Fishwick, a teacher of American studies educated at Yale. More in the tradition of anthropology than literary scholarship, Fishwick taught his classes to look at the whole of a culture, including those elements considered profane.[citation needed] Wolfe's undergraduate thesis, entitled "A Zoo Full of Zebras: Anti-Intellectualism in America," evinced his fondness for words and aspirations toward cultural criticism. Wolfe graduated cum laude in 1951.

Wolfe had continued playing baseball as a pitcher and had begun to play semi-professionally while still in college. In 1952 he earned a tryout with the New York Giants but was cut after three days, which Wolfe blamed on his inability to throw good fastballs. Wolfe abandoned baseball and instead followed his professor Fishwick's example, enrolling in Yale University's American studies doctoral program. His PhD thesis was titled The League of American Writers: Communist Organizational Activity Among American Writers, 1929–1942.[4] In the course of his research, Wolfe interviewed many writers, including Malcolm Cowley, Archibald MacLeish, and James T. Farrell.[5] A biographer remarked on the thesis: "Reading it, one sees what has been the most baleful influence of graduate education on many who have suffered through it: it deadens all sense of style."[6] His thesis was originally rejected but it was finally accepted after he rewrote it in an objective rather than a subjective style. Upon leaving Yale, he wrote a friend explaining through expletives his personal opinions about his thesis.

Journalism and New Journalism

Though Wolfe was offered teaching jobs in academia, he opted to work as a reporter. In 1956, while still preparing his thesis, Wolfe became a reporter for the Springfield Union in Springfield, Massachusetts. Wolfe finished his thesis in 1957.

In 1959 he was hired by The Washington Post. Wolfe has said that part of the reason the Post hired him was his lack of interest in politics; the Post's city editor was "amazed that Wolfe preferred cityside to Capitol Hill, the beat every reporter wanted." In 1961 he won Washington Newspaper Guild awards for his foreign reporting from Cuba and for humor. While at the Post, Wolfe experimented with fiction-writing techniques in feature stories.[7]

In 1962, Wolfe left Washington for New York City, taking a position with the New York Herald Tribune as a general assignment reporter and feature writer. The editors of the Herald Tribune, including Clay Felker of the Sunday section supplement New York magazine, encouraged their writers to break the conventions of newspaper writing.[8] During the 1962 New York City newspaper strike, Wolfe approached Esquire magazine about an article on the hot rod and custom car culture of Southern California. He struggled with the article until his editor, Byron Dobell, suggested that Wolfe send him his notes so they could piece the story together.

Wolfe procrastinated. The evening before the deadline, he typed a letter to Dobell explaining what he wanted to say on the subject, ignoring all journalistic conventions. Dobell's response was to remove the salutation "Dear Byron" from the top of the letter and publish it intact as reportage. The result, published in 1963, was "There Goes (Varoom! Varoom!) That Kandy-Kolored Tangerine-Flake Streamline Baby." The article was widely discussed—loved by some, hated by others. Its notoriety helped Wolfe gain publication of his first book, The Kandy-Kolored Tangerine-Flake Streamline Baby, a collection of his writings from the Herald-Tribune, Esquire, and other publications.[9]

This was what Wolfe called New Journalism, in which some journalists and essayists experimented with a variety of literary techniques, mixing them with the traditional ideal of dispassionate, even-handed reporting. In this stylized form of journalism, Wolfe experimented with four literary devices not normally associated with feature writing: scene-by-scene construction, extensive dialogue, multiple points of view, and detailed description of individuals' status-life symbols (the material choices people make). He later referred to this style as literary journalism.[10] Of the use of status symbols, Wolfe has said, "I think every living moment of a human being's life, unless the person is starving or in immediate danger of death in some other way, is controlled by a concern for status."[11]

Wolfe also championed what he called “saturation reporting,” a reportorial approach in which the journalist “shadows” and observes the subject over an extended period of time. “To pull it off,” says Wolfe, “you casually have to stay with the people you are writing about for long stretches . . . long enough so that you are actually there when revealing scenes take place in their lives.”[12] Saturation reporting differs from “in-depth” and “investigative” reporting, which involve the direct interviewing of numerous sources and/or the extensive analyzing of external documents relating to the story. Saturation reporting, according to communication professor Richard Kallan, “entails a more complex set of relationships wherein the journalist becomes an involved, more fully reactive witness, no longer distanced and detached from the people and events reported.”[13]

Wolfe's The Electric Kool-Aid Acid Test is considered a striking example of New Journalism. In this account of the Merry Pranksters, a famous sixties counter-culture group, Wolfe experimented with onomatopoeia, free association, and eccentric punctuation—such as multiple exclamation marks and italics—to convey the manic ideas and personalities of Ken Kesey and his followers.

In addition to his own work, Wolfe edited a collection of New Journalism with E.W. Johnson, published in 1973 and titled The New Journalism. This book published pieces by Truman Capote, Hunter S. Thompson, Norman Mailer, Gay Talese, Joan Didion, and several other well-known writers, with the common theme of journalism that incorporated literary techniques and that could be considered literature.[14]

Non-fiction books

In 1965, Wolfe published a collection of his articles in this style, The Kandy-Kolored Tangerine-Flake Streamline Baby, adding to his notability. He published a second collection of articles, The Pump House Gang, in 1968. Wolfe wrote on popular culture, architecture, politics, and other topics that underscored, among other things, how American life in the 1960s had been transformed by post-WWII economic prosperity. His defining work from this era is The Electric Kool-Aid Acid Test (published the same day as The Pump House Gang in 1968), which for many epitomized the 1960s. Although a conservative in many ways (in 2008, he claimed never to have used LSD and to have tried marijuana only once[15]) Wolfe became one of the notable figures of the decade.

In 1970, he published two essays in book form as Radical Chic & Mau-Mauing the Flak Catchers. "Radical Chic" was a biting account of a party given by composer and conductor Leonard Bernstein to raise money for the Black Panther Party. "Mau-Mauing The Flak Catchers" was about the practice by some African Americans of using racial intimidation ("mau-mauing") to extract funds from government welfare bureaucrats ("flak catchers"). Wolfe's phrase, "radical chic", soon became a popular derogatory term for critics to apply to upper-class leftism. His Mauve Gloves & Madmen, Clutter & Vine (1977) included Wolfe's noted essay, "The Me Decade and the Third Great Awakening."

In 1979, Wolfe published The Right Stuff, an account of the pilots who became America's first astronauts. Following their training and their unofficial, even foolhardy, exploits, the book likens these heroes to "single combat champions" of a bygone era, going forth to battle in the space race on behalf of their country. In 1983, the book was adapted as a successful feature film.

In 2016 Wolfe published The Kingdom of Speech, a controversial critique of the work of Charles Darwin and Noam Chomsky. His take on how humans developed speech was described as opinionated and not supported by research.[16][17]

Art critiques

Wolfe also wrote two critiques and social histories of modern art and modern architecture, The Painted Word and From Bauhaus to Our House, published in 1975 and 1981, respectively. The Painted Word mocked the excessive insularity of the art world and its dependence on what he saw as faddish critical theory. In From Bauhaus to Our House he explored what he said were the negative effects of the Bauhaus style on the evolution of modern architecture.[18]

Made-for-TV movie

In 1977 PBS produced Tom Wolfe's Los Angeles, a fictional, satirical TV movie set in Los Angeles. Wolfe appears in the movie as himself.[19]

Novels

Throughout his early career, Wolfe had planned to write a novel to capture the wide reach of American society. Among his models was William Makepeace Thackeray's Vanity Fair, which described the society of 19th-century England. In 1981, he ceased his other work to concentrate on the novel.

Wolfe began researching the novel by observing cases at the Manhattan Criminal Court and shadowing members of the Bronx homicide squad. While the research came easily, he encountered difficulty in writing. To overcome his writer's block, Wolfe wrote to Jann Wenner, editor of Rolling Stone, to propose an idea drawn from Charles Dickens and Thackeray: to serialize his novel. Wenner offered Wolfe around $200,000 to serialize his work.[20] The frequent deadline pressure gave him the motivation he had hoped for, and from July 1984 to August 1985, he published a new installment in each biweekly issue of Rolling Stone.

Later Wolfe was unhappy with his "very public first draft"[21] and thoroughly revised his work, even changing his protagonist Sherman McCoy. Wolfe had originally made him a writer but recast him as a bond salesman. Wolfe researched and revised for two years, and his The Bonfire of the Vanities was published in 1987. The book was a commercial and critical success, spending weeks on bestseller lists and earning praise from the very literary establishment on which Wolfe had long heaped scorn.[22]

Because of the success of Wolfe's first novel, there was widespread interest in his second. This novel took him more than 11 years to complete; A Man in Full was published in 1998. The book's reception was not universally favorable, though it received glowing reviews in Time, Newsweek, The Wall Street Journal, and elsewhere. An initial printing of 1.2 million copies was announced, and the book stayed at number one on the New York Times bestseller list for ten weeks. John Updike wrote a critical review for The New Yorker, complaining that the novel "amounts to entertainment, not literature, even literature in a modest aspirant form."[citation needed] His comments sparked an intense war of words in the print and broadcast media between Wolfe and Updike, with authors John Irving and Norman Mailer also entering the fray.

In 2001, Wolfe published an essay referring to these three authors as "My Three Stooges."[citation needed] That year he also published Hooking Up (a collection of short pieces, including the 1997 novella Ambush at Fort Bragg).

He published his third novel, I Am Charlotte Simmons (2004), chronicling the decline of a bright, poor scholarship student from Alleghany County, North Carolina, after she enrolls at an elite university. The novel depicts an institution rife with snobbery, materialism, anti-intellectualism, and sexual promiscuity. It met with a mostly tepid response from critics, though many social conservatives praised it in the belief that its portrayal revealed widespread moral decline. The novel won a Bad Sex in Fiction Award from the London-based Literary Review, a prize established "to draw attention to the crude, tasteless, often perfunctory use of redundant passages of sexual description in the modern novel".[citation needed] Wolfe later explained that such sexual references were deliberately clinical.[citation needed]

Wolfe has written that his goal in writing fiction is to document contemporary society in the tradition of John Steinbeck, Charles Dickens, and Émile Zola.

Wolfe announced in early 2008 that he was leaving his longtime publisher, Farrar, Straus and Giroux. His fourth novel, Back to Blood, was published in October 2012 by Little, Brown. According to The New York Times, Wolfe was paid close to US$7 million for the book.[23] According to the publisher, Back to Blood is about "class, family, wealth, race, crime, sex, corruption and ambition in Miami, the city where America's future has arrived first."[24]

Recurring themes

Wolfe revisits particular themes in both his non-fiction writing and novels. He explores male power-jockeying, in The Bonfire of the Vanities, A Man in Full, and I Am Charlotte Simmons, as well as several of his journalistic pieces. Male characters in his fiction often suffer from feelings of extreme inadequacy or hugely inflated egos, sometimes alternating between both.

He satirizes racial politics, most commonly between whites and blacks; he also highlights class divisions between characters. He describes men's fashions to indicate economic status.

Much of his recent work also addresses neuroscience. He notes his fascination in "Sorry, Your Soul Just Died," one of the essays in Hooking Up. This topic is also featured in I Am Charlotte Simmons, as the title character is a student of neuroscience. Wolfe describes the characters' thought and emotional processes, such as fear, humiliation and lust, in the clinical terminology of brain chemistry. Wolfe also frequently gives detailed descriptions of various aspects of his characters' anatomies.[25]

Two of his novels (A Man in Full and I Am Charlotte Simmons) feature major characters (Conrad Hensley and Jojo Johanssen, respectively) who begin paths to self-discovery by reading classical Roman and Greek philosophy.

Wolfe names law and banking firms satirically, formed by the surnames of the partners. For instance, "Dunning, Sponget and Leach" and "Curry, Goad and Pesterall" appear in The Bonfire of the Vanities, and "Wringer, Fleasom and Tick" in A Man in Full. Ambush at Fort Bragg contains a law firm called "Crotalus, Adder, Cobran and Krate" (all names or homophones of venomous snakes).

Some characters appear in multiple novels, creating a sense of a "universe" that is continuous throughout Wolfe's fiction. The character of Freddy Button, a lawyer from Bonfire of the Vanities, is mentioned briefly in I Am Charlotte Simmons. A character named Ronald Vine, an interior decorator who is mentioned in The Bonfire of the Vanities, reappears in A Man in Full as the designer of Charlie Croker's home.

He describes a fictional sexual practice, called "that thing with the cup", both in novels and a non-fiction essay in Hooking Up.

White suit

Wolfe adopted a white suit as his trademark in 1962. He bought his first white suit planning to wear it in the summer, in the style of Southern gentlemen, but found that the suit he had purchased was too heavy for summer use, so he wore it in winter, which created a sensation; at the time, white suits were supposed to be reserved for summer wear.[26] Wolfe has maintained the white suit as a trademark ever since, sometimes accompanying it with a white tie, white homburg hat, and two-tone shoes. Wolfe has said that the outfit disarms the people he observes, making him, in their eyes, "a man from Mars, the man who didn't know anything and was eager to know."[27]

Views

In 1989, Wolfe wrote an essay for Harper's Magazine titled "Stalking the Billion-Footed Beast". It criticized modern American novelists for failing to engage fully with their subjects, and suggested that modern literature could be saved by a greater reliance on journalistic technique.[28] This attack on the mainstream literary establishment was interpreted as a boast that Wolfe's work was superior to that of more highly regarded authors.[citation needed]

Wolfe supported George W. Bush as a political candidate and said he voted for him for president in 2004 because of what he called Bush's "great decisiveness and willingness to fight." (Bush apparently reciprocates the admiration and, according to friends in 2005, has read all of Wolfe's books.[29]) After his support of Bush was publicized in a New York Times interview,[citation needed] Wolfe said that the reaction in the literary world was as if he had said, "I forgot to tell you—I'm a child molester."[citation needed] Because of this incident, he sometimes wears an American flag pin on his suit, which he compared to "holding up a cross to werewolves."[30]

Wolfe's views and choice of subject material, such as mocking left-wing intellectuals in Radical Chic and glorifying astronauts in The Right Stuff, have sometimes resulted in his being labeled conservative.[31] Due to his depiction of the Black Panther Party in Radical Chic, a member of the party called him a racist.[32] Wolfe rejects such labels. In a 2004 interview in The Guardian, he said that his "idol" in writing about society and culture is Émile Zola. Wolfe described him as "a man of the left"; one who "went out, and found a lot of ambitious, drunk, slothful and mean people out there. Zola simply could not—and was not interested in—telling a lie."[31]

Asked to comment by the Wall Street Journal on blogs in 2007 to mark the tenth anniversary of their advent, Wolfe wrote that "the universe of blogs is a universe of rumors" and that "blogs are an advance guard to the rear."[33] He also took the opportunity to criticize Wikipedia, saying that "only a primitive would believe a word of" it. He noted a story about him in his Wikipedia bio article at the time, which he said had never happened.[33]

Personal life

Wolfe lives in New York City with his wife Sheila, who designs covers for Harper's Magazine. They have two children: a daughter, Alexandra, and a son, Tommy.[34]

A writer for Examiner Magazine who interviewed Wolfe in 1998 said, "He has no computer and does not surf, or even know how to use, the Internet". He also noted, however, that Wolfe's novel A Man in Full has a subplot involving "a muckraking cyber-gossip site, à la the Drudge Report or Salon."[34]

Influence

Wolfe is credited with introducing terms such as "statusphere," "the right stuff," "radical chic," "the Me Decade," "social x-ray," and "pushing the envelope" into the English lexicon.[35] He is sometimes credited with coining the term "trophy wife" as well, but this is incorrect: he described extremely thin women as "X-rays" in The Bonfire of the Vanities but did not use the term "trophy wife".[36] According to journalism professor Ben Yagoda, Wolfe emphasized writing in the present tense in magazine profile pieces; before he began doing so in the early 1960s, profile articles had always been written in the past tense.[37]

List of awards and nominations

  • 1961 Washington Newspaper Guild Award for Foreign News Reporting
  • 1961 Washington Newspaper Guild Award for Humor
  • 1970 Society of Magazine Writers Award for Excellence
  • 1971 D.F.A., Minneapolis College of Art
  • 1973 Frank Luther Mott Research Award
  • 1974 D.Litt., Washington and Lee University
  • 1977 Virginia Laureate for literature
  • 1979 National Book Critics Circle General Nonfiction Finalist for The Right Stuff
  • 1980 National Book Award for Nonfiction for The Right Stuff[38]
  • 1980 Columbia Journalism Award for The Right Stuff
  • 1980 Harold D. Vursell Memorial Award of the American Institute of Arts and Letters
  • 1980 Art History Citation from the National Sculpture Society
  • 1983 L.H.D., Virginia Commonwealth University
  • 1984 L.H.D., Southampton College
  • 1984 John Dos Passos Award
  • 1986 Gari Melchers Medal
  • 1986 Benjamin Pierce Cheney Medal from Eastern Washington University
  • 1986 Washington Irving Medal for Literary Excellence
  • 1987 National Book Critics Circle fiction Finalist for The Bonfire of the Vanities
  • 1987 D.F.A., School of Visual Arts
  • 1988 L.H.D., Randolph-Macon College
  • 1988 L.H.D., Manhattanville College
  • 1989 L.H.D., Longwood College
  • 1990 St. Louis Literary Award from Saint Louis University Library Associates[39][40]
  • 1990 D.Litt., St. Andrews Presbyterian College
  • 1990 D.Litt., Johns Hopkins University
  • 1993 D.Litt., University of Richmond
  • 1998 National Book Award Finalist for A Man in Full[41]
  • 2001 National Humanities Medal
  • 2003 Chicago Tribune Literary Prize for Lifetime Achievement
  • 2004 Bad Sex in Fiction Award from the Literary Review
  • 2005 Academy of Achievement Golden Plate Award
  • 2006 Jefferson Lecture in Humanities
  • 2010 National Book Foundation Medal for Distinguished Contribution to American Letters[42]

Television appearances

  • Wolfe was featured as an interview subject in the 1987 PBS documentary series Space Flight.
  • In July 1975 Wolfe was interviewed on Firing Line by William F. Buckley Jr., discussing "The Painted Word".[43]
  • Wolfe was featured on the February 2006 episode "The White Stuff" of Speed Channel's Unique Whips, where his Cadillac's interior was customized to match his trademark white suit.[44]
  • Wolfe guest-starred alongside Jonathan Franzen, Gore Vidal and Michael Chabon in The Simpsons episode "Moe'N'a Lisa", which aired November 19, 2006. He was originally slated to be killed by a giant boulder, but that ending was edited out.[45] Wolfe was also used as a sight gag on The Simpsons episode "Insane Clown Poppy", which aired on November 12, 2000. Homer spills chocolate on Wolfe's trademark white suit, and Wolfe rips it off in one swift motion, revealing an identical suit underneath.

Bibliography

Notable articles

  • "The Last American Hero Is Junior Johnson. Yes!" Esquire, March 1965.
  • "Tiny Mummies! The True Story of the Ruler of 43rd Street's Land of the Walking Dead!" New York Herald-Tribune supplement (April 11, 1965).
  • "Lost in the Whichy Thicket," New York Herald-Tribune supplement (April 18, 1965).
  • "The Birth of the New Journalism: Eyewitness Report by Tom Wolfe." New York, February 14, 1972.
  • "The New Journalism: A la Recherche des Whichy Thickets." New York Magazine, February 21, 1972.
  • "Why They Aren't Writing the Great American Novel Anymore." Esquire, December 1972.
  • "The Me Decade and the Third Great Awakening" New York, August 23, 1976.
  • "Stalking the Billion-Footed Beast", Harper's. November 1989.
  • "Sorry, but Your Soul Just Died." Forbes 1996.
  • "Pell Mell." The Atlantic Monthly (November 2007).
  • "The Rich Have Feelings, Too." Vanity Fair (September 2009).

References

  1. Bloom, Harold. Tom Wolfe, Infobase Publishing, 2001, ISBN 0-7910-5916-2, p. 193.
  2. Rolling Stone interview, May 2, 2007, via samharris.org. Retrieved November 15, 2008.
  3. Weingarten, Marc (January 1, 2006). The Gang That Wouldn't Write Straight: Wolfe, Thompson, Didion, and the New Journalism Revolution. Crown Publishers – via Google Books.
  4. Available on microform from the Yale University Libraries.
  5. Ragen 2002, pp. 6–10.
  6. Ragen 2002, p. 9.
  7. Rosen, James (July 2, 2006). "Tom Wolfe's Washington Post". The Washington Post. Retrieved March 9, 2007.
  8. McLellan, Dennis (July 2, 2008). "Clay Felker, 82; editor of New York magazine led New Journalism charge". Los Angeles Times. Retrieved November 23, 2008.
  9. Ragen 2002, pp. 11–12.
  10. Wolfe, Tom; Johnson, E. W. (1973). The New Journalism. New York: Harper & Row. pp. 31–33. ISBN 0-06-014707-5.
  11. "A Guide to the Work of Tom Wolfe". contemporarythinkers.org.
  12. Wolfe, Tom (September 1970). "The New Journalism". Bulletin of American Society of Newspapers: 22.
  13. Kallan, Richard A. (1992). "Tom Wolfe". In Connery, Thomas B. (ed.), A Sourcebook of American Literary Journalism: Representative Writers in an Emerging Genre. New York: Greenwood Press. p. 252.
  14. Ragen 2002, pp. 19–22.
  15. "10 Questions for Tom Wolfe". Time. August 28, 2008. Retrieved May 25, 2010.
  16. Coyne, Jerry (August 31, 2016). "His white suit unsullied by research, Tom Wolfe tries to take down Charles Darwin and Noam Chomsky". The Washington Post. Retrieved September 1, 2016.
  17. Sullivan, James (August 25, 2016). "Tom Wolfe traces the often-amusing history of bickering over how humans started talking". The Boston Globe. Retrieved August 26, 2016.
  18. Ragen 2002, pp. 22–29.
  19. "Tom Wolfe's Satirical Look at Los Angeles". The Daily News of the Virgin Islands. Daily News Publishing Co., Inc. January 25, 1977. p. 18. Retrieved October 20, 2017 – via Google News Archive.
  20. Ragen 2002, p. 31.
  21. Ragen 2002, p. 32.
  22. Ragen 2002, pp. 30–34.
  23. Rich, Motoko (January 3, 2008). "Tom Wolfe Leaves Longtime Publisher, Taking His New Book". The New York Times. Retrieved January 3, 2008.
  24. Trachtenberg, Jeffrey A. (January 3, 2008). "Tom Wolfe Changes Scenery; Iconic Author Seeks Lift With New Publisher, Miami-Centered Drama". The Wall Street Journal. Retrieved January 3, 2008.
  25. "Muscle-Bound". The New Yorker. October 15, 2012.
  26. Ragen 2002, p. 12.
  27. Freeman, John (December 18, 2004). "In Wolfe's clothing". The Sydney Morning Herald.
  28. Wolfe, Tom (November 1989). "Stalking the Billion-Footed Beast". Harper's Magazine.
  29. Bumiller, Elisabeth (February 7, 2005). "Bush's Official Reading List, and a Racy Omission". The New York Times. Retrieved May 15, 2010.
  30. Rago, Joseph (March 11, 2006). "Status Reporter". The Wall Street Journal.
  31. Vulliamy, Ed (November 1, 2004). "'The liberal elite hasn't got a clue'". The Guardian.
  32. Foote, Timothy (December 21, 1970). "Books: Fish in the Brandy Snifter". Time – via time.com.
  33. Varadarajan, Tunku (July 14, 2007). "Happy Blogiversary". The Wall Street Journal.
  34. Cash, William (November 29, 1998). "Southern Man". San Francisco Chronicle. Hearst Communications. Retrieved December 12, 2015 – via sfgate.com.
  35. Hindley, Meredith (2006). "Tom Wolfe – Jefferson Lecturer Biography". National Endowment for the Humanities.
  36. Safire, William (May 1, 1994). "On Language; Trophy Wife". The New York Times.
  37. Yagoda, Ben (2007). When You Catch an Adjective, Kill It. Broadway Books. p. 228. ISBN 9780767920773.
  38. "National Book Awards – 1980". National Book Foundation. Retrieved March 11, 2012.
  39. "Saint Louis Literary Award". slu.edu. Saint Louis University.
  40. "Recipients of the Saint Louis Literary Award". slu.edu. Saint Louis University Library Associates. Retrieved July 25, 2016.
  41. "National Book Awards – 1998". nationalbook.org. National Book Foundation. Retrieved March 11, 2012.
  42. "Distinguished Contribution to American Letters". nationalbook.org. National Book Foundation. Includes Wolfe's acceptance speech. Retrieved March 11, 2012.
  43. Scura, Dorothy McInnis (January 1, 1990). Conversations with Tom Wolfe. Univ. Press of Mississippi – via Google Books.
  44. "The White Stuff". March 8, 2006 – via IMDb.
  45. Bond, Corey (November 30, 2005). "Crisis on Infinite Springfields: 'Tom Wolfe Is Screaming'".
  46. "About Tom Wolfe".
  • Bloom, Harold, ed. (2001), Tom Wolfe (Modern Critical Views), Philadelphia: Chelsea House Publishers, ISBN 0-7910-5916-2.
  • McKeen, William (1995), Tom Wolfe.

Sorry, But Your Soul Just Died

Tom Wolfe

From neuroscience to Nietzsche. A sobering look at how man may perceive himself in the future, particularly as ideas about genetic predeterminism take the place of dying Darwinism. This article was first published in "Forbes ASAP" in 1996.

Being a bit behind the curve, I had only just heard of the digital revolution last February when Louis Rossetto, cofounder of Wired magazine, wearing a shirt with no collar and his hair as long as Felix Mendelssohn's, looking every inch the young California visionary, gave a speech before the Cato Institute announcing the dawn of the twenty–first century's digital civilization. As his text, he chose the maverick Jesuit scientist and philosopher Pierre Teilhard de Chardin, who fifty years ago prophesied that radio, television, and computers would create a "noösphere," an electronic membrane covering the earth and wiring all humanity together in a single nervous system. Geographic locations, national boundaries, the old notions of markets and political processes—all would become irrelevant. With the Internet spreading over the globe at an astonishing pace, said Rossetto, that marvelous modem–driven moment is almost at hand.

Could be. But something tells me that within ten years, by 2006, the entire digital universe is going to seem like pretty mundane stuff compared to a new technology that right now is but a mere glow radiating from a tiny number of American and Cuban (yes, Cuban) hospitals and laboratories. It is called brain imaging, and anyone who cares to get up early and catch a truly blinding twenty–first–century dawn will want to keep an eye on it.

Brain imaging refers to techniques for watching the human brain as it functions, in real time. The most advanced forms currently are three–dimensional electroencephalography using mathematical models; the more familiar PET scan (positron–emission tomography); the new fMRI (functional magnetic resonance imaging), which shows brain blood–flow patterns, and MRS (magnetic resonance spectroscopy), which measures biochemical changes in the brain; and the even newer PET reporter gene/PET reporter probe, which is, in fact, so new that it still has that length of heavy lumber for a name. Used so far only in animals and a few desperately sick children, the PET reporter gene/PET reporter probe pinpoints and follows the activity of specific genes. On a scanner screen you can actually see the genes light up inside the brain.

By 1996 standards, these are sophisticated devices. Ten years from now, however, they may seem primitive compared to the stunning new windows into the brain that will have been developed.

Brain imaging was invented for medical diagnosis. But its far greater importance is that it may very well confirm, in ways too precise to be disputed, certain theories about "the mind," "the self," "the soul," and "free will" that are already devoutly believed in by scholars in what is now the hottest field in the academic world, neuroscience. Granted, all those skeptical quotation marks are enough to put anybody on the qui vive right away, but Ultimate Skepticism is part of the brilliance of the dawn I have promised.

Neuroscience, the science of the brain and the central nervous system, is on the threshold of a unified theory that will have an impact as powerful as that of Darwinism a hundred years ago. Already there is a new Darwin, or perhaps I should say an updated Darwin, since no one ever believed more religiously in Darwin I than he does. His name is Edward O. Wilson. He teaches zoology at Harvard, and he is the author of two books of extraordinary influence, The Insect Societies and Sociobiology: The New Synthesis. Not "A" new synthesis but "The" new synthesis; in terms of his stature in neuroscience, it is not a mere boast.

Wilson has created and named the new field of sociobiology, and he has compressed its underlying premise into a single sentence. Every human brain, he says, is born not as a blank tablet (a tabula rasa) waiting to be filled in by experience but as "an exposed negative waiting to be slipped into developer fluid." You can develop the negative well or you can develop it poorly, but either way you are going to get precious little that is not already imprinted on the film. The print is the individual's genetic history, over thousands of years of evolution, and there is not much anybody can do about it. Furthermore, says Wilson, genetics determine not only things such as temperament, role preferences, emotional responses, and levels of aggression, but also many of our most revered moral choices, which are not choices at all in any free–will sense but tendencies imprinted in the hypothalamus and limbic regions of the brain, a concept expanded upon in 1993 in a much–talked–about book, The Moral Sense, by James Q. Wilson (no kin to Edward O.).

The neuroscientific view of life

This, the neuroscientific view of life, has become the strategic high ground in the academic world, and the battle for it has already spread well beyond the scientific disciplines and, for that matter, out into the general public. Both liberals and conservatives without a scientific bone in their bodies are busy trying to seize the terrain. The gay rights movement, for example, has fastened onto a study published in July of 1993 by the highly respected Dean Hamer of the National Institutes of Health, announcing the discovery of "the gay gene." Obviously, if homosexuality is a genetically determined trait, like left–handedness or hazel eyes, then laws and sanctions against it are attempts to legislate against Nature. Conservatives, meantime, have fastened upon studies indicating that men's and women's brains are wired so differently, thanks to the long haul of evolution, that feminist attempts to open up traditionally male roles to women are the same thing: a doomed violation of Nature.

Wilson himself has wound up in deep water on this score; or cold water, if one need edit. In his personal life Wilson is a conventional liberal, PC, as the saying goes—he is, after all, a member of the Harvard faculty—concerned about environmental issues and all the usual things. But he has said that "forcing similar role identities" on both men and women "flies in the face of thousands of years in which mammals demonstrated a strong tendency for sexual division of labor. Since this division of labor is persistent from hunter–gatherer through agricultural and industrial societies, it suggests a genetic origin. We do not know when this trait evolved in human evolution or how resistant it is to the continuing and justified pressures for human rights."

"Resistant" was Darwin II, the neuroscientist, speaking. "Justified" was the PC Harvard liberal. He was not PC or liberal enough. Feminist protesters invaded a conference where Wilson was appearing, dumped a pitcher of ice water, cubes and all, over his head, and began chanting, "You're all wet! You're all wet!" The most prominent feminist in America, Gloria Steinem, went on television and, in an interview with John Stossel of ABC, insisted that studies of genetic differences between male and female nervous systems should cease forthwith.

But that turned out to be mild stuff in the current political panic over neuroscience. In February of 1992, Frederick K. Goodwin, a renowned psychiatrist, head of the federal Alcohol, Drug Abuse, and Mental Health Administration, and a certified yokel in the field of public relations, made the mistake of describing, at a public meeting in Washington, the National Institute of Mental Health's ten–year–old Violence Initiative. This was an experimental program whose hypothesis was that, as among monkeys in the jungle—Goodwin was noted for his monkey studies—much of the criminal mayhem in the United States was caused by a relatively few young males who were genetically predisposed to it; who were hardwired for violent crime, in short. Out in the jungle, among mankind's closest animal relatives, the chimpanzees, it seemed that a handful of genetically twisted young males were the ones who committed practically all of the wanton murders of other males and the physical abuse of females. What if the same were true among human beings? What if, in any given community, it turned out to be a handful of young males with toxic DNA who were pushing statistics for violent crime up to such high levels? The Violence Initiative envisioned identifying these individuals in childhood, somehow, some way, someday, and treating them therapeutically with drugs. The notion that crime–ridden urban America was a "jungle," said Goodwin, was perhaps more than just a tired old metaphor.

That did it. That may have been the stupidest single word uttered by an American public official in the year 1992. The outcry was immediate. Senator Edward Kennedy of Massachusetts and Representative John Dingell of Michigan (who, it became obvious later, suffered from hydrophobia when it came to science projects) not only condemned Goodwin's remarks as racist but also delivered their scientific verdict: Research among primates "is a preposterous basis" for analyzing anything as complex as "the crime and violence that plagues our country today." (This came as surprising news to NASA scientists who had first trained and sent a chimpanzee called Ham up on top of a Redstone rocket into suborbital space flight and then trained and sent another one, called Enos, which is Greek for "man," up on an Atlas rocket and around the earth in orbital space flight and had thereby accurately and completely predicted the physical, psychological, and task–motor responses of the human astronauts, Alan Shepard and John Glenn, who repeated the chimpanzees' flights and tasks months later.) The Violence Initiative was compared to Nazi eugenic proposals for the extermination of undesirables. Dingell's Michigan colleague, Representative John Conyers, then chairman of the Government Operations Committee and senior member of the Congressional Black Caucus, demanded Goodwin's resignation—and got it two days later, whereupon the government, with the Department of Health and Human Services now doing the talking, denied that the Violence Initiative had ever existed. It disappeared down the memory hole, to use Orwell's term.

A conference of criminologists and other academics interested in the neuroscientific studies done so far for the Violence Initiative—a conference underwritten in part by a grant from the National Institutes of Health—had been scheduled for May of 1993 at the University of Maryland. Down went the conference, too; the NIH drowned it like a kitten. Last year, a University of Maryland legal scholar named David Wasserman tried to reassemble the troops on the QT, as it were, in a hall all but hidden from human purview in a hamlet called Queenstown in the foggy, boggy boondocks of Queen Annes County on Maryland's Eastern Shore. The NIH, proving it was a hard learner, quietly provided $133,000 for the event but only after Wasserman promised to fireproof the proceedings by also inviting scholars who rejected the notion of a possible genetic genesis of crime and scheduling a cold–shower session dwelling on the evils of the eugenics movement of the early twentieth century. No use, boys! An army of protesters found the poor cringing devils anyway and stormed into the auditorium chanting, "Maryland conference, you can't hide—we know you're pushing genocide!" It took two hours for them to get bored enough to leave, and the conference ended in a complete muddle with the specially recruited fireproofing PC faction issuing a statement that said: "Scientists as well as historians and sociologists must not allow themselves to provide academic respectability for racist pseudoscience." Today, at the NIH, the term Violence Initiative is a synonym for taboo. The present moment resembles that moment in the Middle Ages when the Catholic Church forbade the dissection of human bodies, for fear that what was discovered inside might cast doubt on the Christian doctrine that God created man in his own image.

Even more radioactive is the matter of intelligence, as measured by IQ tests. Privately—not many care to speak out—the vast majority of neuroscientists believe the genetic component of an individual's intelligence is remarkably high. Your intelligence can be improved upon, by skilled and devoted mentors, or it can be held back by a poor upbringing—i.e., the negative can be well developed or poorly developed—but your genes are what really make the difference. The recent ruckus over Charles Murray and Richard Herrnstein's The Bell Curve is probably just the beginning of the bitterness the subject is going to create.

Not long ago, according to two neuroscientists I interviewed, a firm called Neurometrics sought out investors and tried to market an amazing but simple invention known as the IQ Cap. The idea was to provide a way of testing intelligence that would be free of "cultural bias," one that would not force anyone to deal with words or concepts that might be familiar to people from one culture but not to people from another. The IQ Cap recorded only brain waves; and a computer, not a potentially biased human test–giver, analyzed the results. It was based on the work of neuroscientists such as E. Roy John, who is now one of the major pioneers of electroencephalographic brain imaging; Duilio Giannitrapani, author of The Electrophysiology of Intellectual Functions; and David Robinson, author of The Wechsler Adult Intelligence Scale and Personality Assessment: Toward a Biologically Based Theory of Intelligence and Cognition and many other monographs famous among neuroscientists. I spoke to one researcher who had devised an IQ Cap himself by replicating an experiment described by Giannitrapani in The Electrophysiology of Intellectual Functions. It was not a complicated process. You attached sixteen electrodes to the scalp of the person you wanted to test. You had to muss up his hair a little, but you didn't have to cut it, much less shave it. Then you had him stare at a marker on a blank wall. This particular researcher used a raspberry–red thumbtack. Then you pushed a toggle switch. In sixteen seconds the Cap's computer box gave you an accurate prediction (within one–half of a standard deviation) of what the subject would score on all eleven subtests of the Wechsler Adult Intelligence Scale or, in the case of children, the Wechsler Intelligence Scale for Children—all from sixteen seconds' worth of brain waves. There was nothing culturally biased about the test whatsoever. What could be cultural about staring at a thumbtack on a wall? The savings in time and money were breathtaking. The conventional IQ test took two hours to complete; and the overhead, in terms of paying test–givers, test–scorers, test–preparers, and the rent, was $100 an hour at the very least. The IQ Cap required about fifteen minutes and sixteen seconds—it took about fifteen minutes to put the electrodes on the scalp—and about a tenth of a penny's worth of electricity. Neurometrics's investors were rubbing their hands and licking their chops. They were about to make a killing.

In fact—nobody wanted their damnable IQ Cap!

It wasn't simply that no one believed you could derive IQ scores from brainwaves—it was that nobody wanted to believe it could be done. Nobody wanted to believe that human brainpower is...that hardwired. Nobody wanted to learn in a flash that...the genetic fix is in. Nobody wanted to learn that he was...a hardwired genetic mediocrity...and that the best he could hope for in this Trough of Mortal Error was to live out his mediocre life as a stress–free dim bulb. Barry Sterman of UCLA, chief scientist for a firm called Cognitive Neurometrics, who has devised his own brain–wave technology for market research and focus groups, regards brain–wave IQ testing as possible—but in the current atmosphere you "wouldn't have a Chinaman's chance of getting a grant" to develop it.

Science is a court from which there is no appeal

Here we begin to sense the chill that emanates from the hottest field in the academic world. The unspoken and largely unconscious premise of the wrangling over neuroscience's strategic high ground is: We now live in an age in which science is a court from which there is no appeal. And the issue this time around, at the end of the twentieth century, is not the evolution of the species, which can seem a remote business, but the nature of our own precious inner selves.

The elders of the field, such as Wilson, are well aware of all this and are cautious, or cautious compared to the new generation. Wilson still holds out the possibility—I think he doubts it, but he still holds out the possibility—that at some point in evolutionary history, culture began to influence the development of the human brain in ways that cannot be explained by strict Darwinian theory. But the new generation of neuroscientists are not cautious for a second. In private conversations, the bull sessions, as it were, that create the mental atmosphere of any hot new science—and I love talking to these people—they express an uncompromising determinism.

They start with the most famous statement in all of modern philosophy, Descartes's "Cogito ergo sum," "I think, therefore I am," which they regard as the essence of "dualism," the old–fashioned notion that the mind is something distinct from its mechanism, the brain and the body. (I will get to the second most famous statement in a moment.) This is also known as the "ghost in the machine" fallacy, the quaint belief that there is a ghostly "self" somewhere inside the brain that interprets and directs its operations. Neuroscientists involved in three–dimensional electroencephalography will tell you that there is not even any one place in the brain where consciousness or self–consciousness (Cogito ergo sum) is located. This is merely an illusion created by a medley of neurological systems acting in concert. The young generation takes this yet one step further. Since consciousness and thought are entirely physical products of your brain and nervous system—and since your brain arrived fully imprinted at birth—what makes you think you have free will? Where is it going to come from? What "ghost," what "mind," what "self," what "soul," what anything that will not be immediately grabbed by those scornful quotation marks, is going to bubble up your brain stem to give it to you? I have heard neuroscientists theorize that, given computers of sufficient power and sophistication, it would be possible to predict the course of any human being's life moment by moment, including the fact that the poor devil was about to shake his head over the very idea. I doubt that any Calvinist of the sixteenth century ever believed so completely in predestination as these, the hottest and most intensely rational young scientists in the United States at the end of the twentieth.

Since the late 1970s, in the Age of Wilson, college students have been heading into neuroscience in job lots. The Society for Neuroscience was founded in 1970 with 1,100 members. Today, one generation later, its membership exceeds 26,000. The Society's latest convention, in San Diego, drew 23,052 souls, making it one of the biggest professional conventions in the country. In the venerable field of academic philosophy, young faculty members are jumping ship in embarrassing numbers and shifting into neuroscience. They are heading for the laboratories. Why wrestle with Kant's God, Freedom, and Immortality when it is only a matter of time before neuroscience, probably through brain imaging, reveals the actual physical mechanism that sends these mental constructs, these illusions, synapsing up into the Broca's and Wernicke's areas of the brain?

Which brings us to the second most famous statement in all of modern philosophy: Nietzsche's "God is dead." The year was 1882. (The book was Die Fröhliche Wissenschaft [The Gay Science].) Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a "tremendous event," the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. But before you atheists run up your flags of triumph, he said, think of the implications. "The story I have to tell," wrote Nietzsche, "is the history of the next two centuries." He predicted (in Ecce Homo) that the twentieth century would be a century of "wars such as have never happened on earth," wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be racked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: "If the doctrines...of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly"—he says in an allusion to Darwinism in Untimely Meditations—"are hurled into the people for another generation...then nobody should be surprised when...brotherhoods with the aim of the robbery and exploitation of the non–brothers...will appear in the arena of the future."

Nietzsche's view of guilt, incidentally, is also that of neuroscientists a century later. They regard guilt as one of those tendencies imprinted in the brain at birth. In some people the genetic work is not complete, and they engage in criminal behavior without a twinge of remorse—thereby intriguing criminologists, who then want to create Violence Initiatives and hold conferences on the subject.

Nietzsche said that mankind would limp on through the twentieth century "on the mere pittance" of the old decaying God–based moral codes. But then, in the twenty–first, would come a period more dreadful than the great wars, a time of "the total eclipse of all values" (in The Will to Power). This would also be a frantic period of "revaluation," in which people would try to find new systems of values to replace the osteoporotic skeletons of the old. But you will fail, he warned, because you cannot believe in moral codes without simultaneously believing in a god who points at you with his fearsome forefinger and says "Thou shalt" or "Thou shalt not."

Why should we bother ourselves with a dire prediction that seems so far–fetched as "the total eclipse of all values"? Because of man's track record, I should think. After all, in Europe, in the peaceful decade of the 1880s, it must have seemed even more far–fetched to predict the world wars of the twentieth century and the barbaric brotherhoods of Nazism and Communism. Ecce vates! Ecce vates! Behold the prophet! How much more proof can one demand of a man's powers of prediction?

A hundred years ago those who worried about the death of God could console one another with the fact that they still had their own bright selves and their own inviolable souls for moral ballast and the marvels of modern science to chart the way. But what if, as seems likely, the greatest marvel of modern science turns out to be brain imaging? And what if, ten years from now, brain imaging has proved, beyond any doubt, that not only Edward O. Wilson but also the young generation are, in fact, correct?

The elders, such as Wilson himself and Daniel C. Dennett, the author of Darwin's Dangerous Idea: Evolution and the Meanings of Life, and Richard Dawkins, author of The Selfish Gene and The Blind Watchmaker, insist that there is nothing to fear from the truth, from the ultimate extension of Darwin's dangerous idea. They present elegant arguments as to why neuroscience should in no way diminish the richness of life, the magic of art, or the righteousness of political causes, including, if one need edit, political correctness at Harvard or Tufts, where Dennett is Director of the Center for Cognitive Studies, or Oxford, where Dawkins is something called Professor of Public Understanding of Science. (Dennett and Dawkins, every bit as much as Wilson, are earnestly, feverishly, politically correct.) Despite their best efforts, however, neuroscience is not rippling out into the public on waves of scholarly reassurance. But rippling out it is, rapidly. The conclusion people out beyond the laboratory walls are drawing is: The fix is in! We're all hardwired! That, and: Don't blame me! I'm wired wrong!

From nurture to nature

This sudden switch from a belief in Nurture, in the form of social conditioning, to Nature, in the form of genetics and brain physiology, is the great intellectual event, to borrow Nietzsche's term, of the late twentieth century. Up to now the two most influential ideas of the century have been Marxism and Freudianism. Both were founded upon the premise that human beings and their "ideals"—Marx and Freud knew about quotation marks, too—are completely molded by their environment. To Marx, the crucial environment was one's social class; "ideals" and "faiths" were notions foisted by the upper orders upon the lower as instruments of social control. To Freud, the crucial environment was the Oedipal drama, the unconscious sexual plot that was played out in the family early in a child's existence. The "ideals" and "faiths" you prize so much are merely the parlor furniture you feature for receiving your guests, said Freud; I will show you the cellar, the furnace, the pipes, the sexual steam that actually runs the house. By the mid–1950s even anti–Marxists and anti–Freudians had come to assume the centrality of class domination and Oedipally conditioned sexual drives. On top of this came Pavlov, with his "stimulus–response bonds," and B. F. Skinner, with his "operant conditioning," turning the supremacy of conditioning into something approaching a precise form of engineering.

So how did this brilliant intellectual fashion come to so screeching and ignominious an end?

The demise of Freudianism can be summed up in a single word: lithium. In 1949 an Australian psychiatrist, John Cade, gave five days of lithium therapy—for entirely the wrong reasons—to a fifty–one–year–old mental patient who was so manic–depressive, so hyperactive, unintelligible, and uncontrollable, he had been kept locked up in asylums for twenty years. By the sixth day, thanks to the lithium buildup in his blood, he was a normal human being. Three months later he was released and lived happily ever after in his own home. This was a man who had been locked up and subjected to two decades of Freudian logorrhea to no avail whatsoever. Over the next twenty years antidepressant and tranquilizing drugs completely replaced Freudian talk–talk as treatment for serious mental disturbances. By the mid–1980s, neuroscientists looked upon Freudian psychiatry as a quaint relic based largely upon superstition (such as dream analysis — dream analysis!), like phrenology or mesmerism. In fact, among neuroscientists, phrenology now has a higher reputation than Freudian psychiatry, since phrenology was in a certain crude way a precursor of electroencephalography. Freudian psychiatrists are now regarded as old crocks with sham medical degrees, as ears with wire hairs sprouting out of them that people with more money than sense can hire to talk into.

Marxism was finished off even more suddenly—in a single year, 1973—with the smuggling out of the Soviet Union and the publication in France of the first of the three volumes of Aleksandr Solzhenitsyn's The Gulag Archipelago. Other writers, notably the British historian Robert Conquest, had already exposed the Soviet Union's vast network of concentration camps, but their work was based largely on the testimony of refugees, and refugees were routinely discounted as biased and bitter observers. Solzhenitsyn, on the other hand, was a Soviet citizen, still living on Soviet soil, a zek himself for eleven years, zek being Russian slang for concentration camp prisoner. His credibility had been vouched for by none other than Nikita Khrushchev, who in 1962 had permitted the publication of Solzhenitsyn's novella of the gulag, One Day in the Life of Ivan Denisovich, as a means of cutting down to size the daunting shadow of his predecessor Stalin. "Yes," Khrushchev had said in effect, "what this man Solzhenitsyn has to say is true. Such were Stalin's crimes." Solzhenitsyn's brief fictional description of the Soviet slave labor system was damaging enough. But The Gulag Archipelago, a two–thousand–page, densely detailed, nonfiction account of the Soviet Communist Party's systematic extermination of its enemies, real and imagined, of its own countrymen, by the tens of millions through an enormous, methodical, bureaucratically controlled "human sewage disposal system," as Solzhenitsyn called it— The Gulag Archipelago was devastating. After all, this was a century in which there was no longer any possible ideological detour around the concentration camp. Among European intellectuals, even French intellectuals, Marxism collapsed as a spiritual force immediately. Ironically, it survived longer in the United States before suffering a final, merciful coup de grace on November 9, 1989, with the breaching of the Berlin Wall, which signaled in an unmistakable fashion what a debacle the Soviets' seventy–two–year field experiment in socialism had been. (Marxism still hangs on, barely, acrobatically, in American universities in a Mannerist form known as Deconstruction, a literary doctrine that depicts language itself as an insidious tool used by The Powers That Be to deceive the proles and peasants.)

Freudianism and Marxism—and with them, the entire belief in social conditioning—were demolished so swiftly, so suddenly, that neuroscience has surged in, as if into an intellectual vacuum. Nor do you have to be a scientist to detect the rush.

Anyone with a child in school knows the signs all too well. I have children in school, and I am intrigued by the faith parents now invest—the craze began about 1990—in psychologists who diagnose their children as suffering from a defect known as attention deficit disorder, or ADD. Of course, I have no way of knowing whether this "disorder" is an actual, physical, neurological condition or not, but neither does anybody else in this early stage of neuroscience. The symptoms of this supposed malady are always the same. The child, or, rather, the boy—forty–nine out of fifty cases are boys—fidgets around in school, slides off his chair, doesn't pay attention, distracts his classmates during class, and performs poorly. In an earlier era he would have been pressured to pay attention, work harder, show some self–discipline. To parents caught up in the new intellectual climate of the 1990s, that approach seems cruel, because my little boy's problem is...he's wired wrong! The poor little tyke —the fix has been in since birth! Invariably the parents complain, "All he wants to do is sit in front of the television set and watch cartoons and play Sega Genesis." For how long? "How long? For hours at a time." Hours at a time; as even any young neuroscientist will tell you, that boy may have a problem, but it is not an attention deficit.

Nevertheless, all across America we have the spectacle of an entire generation of little boys, by the tens of thousands, being dosed up on ADD's magic bullet of choice, Ritalin, the CIBA–Geneva Corporation's brand name for the stimulant methylphenidate. I first encountered Ritalin in 1966 when I was in San Francisco doing research for a book on the psychedelic or hippie movement. A certain species of the genus hippie was known as the Speed Freak, and a certain strain of Speed Freak was known as the Ritalin Head. The Ritalin Heads loved Ritalin. You'd see them in the throes of absolute Ritalin raptures...Not a wiggle, not a peep...They would sit engrossed in anything at all...a manhole cover, their own palm wrinkles...indefinitely...through shoulda–been mealtime after mealtime...through raging insomnias...Pure methyl–phenidate nirvana...From 1990 to 1995, CIBA–Geneva's sales of Ritalin rose 600 percent; and not because of the appetites of subsets of the species Speed Freak in San Francisco, either. It was because an entire generation of American boys, from the best private schools of the Northeast to the worst sludge–trap public schools of Los Angeles and San Diego, was now strung out on methylphenidate, diligently doled out to them every day by their connection, the school nurse. America is a wonderful country! I mean it! No honest writer would challenge that statement! The human comedy never runs out of material! It never lets you down!

Meantime, the notion of a self—a self who exercises self–discipline, postpones gratification, curbs the sexual appetite, stops short of aggression and criminal behavior—a self who can become more intelligent and lift itself to the very peaks of life by its own bootstraps through study, practice, perseverance, and refusal to give up in the face of great odds—this old–fashioned notion (what's a boot strap, for God's sake?) of success through enterprise and true grit is already slipping away, slipping away...slipping away...The peculiarly American faith in the power of the individual to transform himself from a helpless cypher into a giant among men, a faith that ran from Emerson ("Self–Reliance") to Horatio Alger's Luck and Pluck stories to Dale Carnegie's How to Win Friends and Influence People to Norman Vincent Peale's The Power of Positive Thinking to Og Mandino's The Greatest Salesman in the World —that faith is now as moribund as the god for whom Nietzsche wrote an obituary in 1882. It lives on today only in the decrepit form of the "motivational talk," as lecture agents refer to it, given by retired football stars such as Fran Tarkenton to audiences of businessmen, most of them woulda–been athletes (like the author of this article), about how life is like a football game. "It's late in the fourth period and you're down by thirteen points and the Cowboys got you hemmed in on your own one–yard line and it's third and twenty–three. Whaddaya do?..."

Sorry, Fran, but it's third and twenty–three and the genetic fix is in, and the new message is now being pumped out into the popular press and onto television at a stupefying rate. Who are the pumps? They are a new breed who call themselves "evolutionary psychologists." You can be sure that twenty years ago the same people would have been calling themselves Freudian; but today they are genetic determinists, and the press has a voracious appetite for whatever they come up with.

The most popular study currently—it is still being featured on television news shows, months later—is David Lykken and Auke Tellegen's study at the University of Minnesota of two thousand twins that shows, according to these two evolutionary psychologists, that an individual's happiness is largely genetic. Some people are hardwired to be happy and some are not. Success (or failure) in matters of love, money, reputation, or power is transient stuff; you soon settle back down (or up) to the level of happiness you were born with genetically. Three months ago Fortune devoted a long takeout, elaborately illustrated, to a study by evolutionary psychologists at Britain's University of St Andrews showing that you judge the facial beauty or handsomeness of people you meet not by any social standards of the age you live in but by criteria hardwired in your brain from the moment you were born. Or, to put it another way, beauty is not in the eye of the beholder but embedded in his genes. In fact, today, in the year 1996, barely three years before the end of the millennium, if your appetite for newspapers, magazines, and television is big enough, you will quickly get the impression that there is nothing in your life, including the fat content of your body, that is not genetically predetermined. If I may mention just a few things the evolutionary psychologists have illuminated for me over the past two months:

The male of the human species is genetically hardwired to be polygamous, i.e., unfaithful to his legal mate. Any magazine–reading male gets the picture soon enough. (Three million years of evolution made me do it!)

Women lust after male celebrities, because they are genetically hardwired to sense that alpha males will take better care of their offspring. (I'm just a lifeguard in the gene pool, honey.)

Teenage girls are genetically hardwired to be promiscuous and are as helpless to stop themselves as dogs in the park. (The school provides the condoms.)

Most murders are the result of genetically hardwired compulsions. (Convicts can read, too, and they report to the prison psychiatrist: "Something came over me...and then the knife went in.")

Where does that leave self–control? Where, indeed, if people believe this ghostly self does not even exist, and brain imaging proves it, once and for all?

So far, neuroscientific theory is based largely on indirect evidence, from studies of animals or of how a normal brain changes when it is invaded (by accidents, disease, radical surgery, or experimental needles). Darwin II himself, Edward O. Wilson, has only limited direct knowledge of the human brain. He is a zoologist, not a neurologist, and his theories are extrapolations from the exhaustive work he has done in his specialty, the study of insects. The French surgeon Paul Broca discovered Broca's area, one of the two speech centers of the left hemisphere of the brain, only after one of his patients suffered a stroke. Even the PET scan and the PET reporter gene/PET reporter probe are technically medical invasions, since they require the injection of chemicals or viruses into the body. But they offer glimpses of what the noninvasive imaging of the future will probably look like. A neuroradiologist can read a list of topics out loud to a person being given a PET scan, topics pertaining to sports, music, business, history, whatever, and when he finally hits one the person is interested in, a particular area of the cerebral cortex actually lights up on the screen. Eventually, as brain imaging is refined, the picture may become as clear and complete as those see–through exhibitions, at auto shows, of the inner workings of the internal combustion engine. At that point it may become obvious to everyone that all we are looking at is a piece of machinery, an analog chemical computer, that processes information from the environment. "All," since you can look and look and you will not find any ghostly self inside, or any mind, or any soul.

Thereupon, in the year 2006 or 2026, some new Nietzsche will step forward to announce: "The self is dead"—except that being prone to the poetic, like Nietzsche I, he will probably say: "The soul is dead." He will say that he is merely bringing the news, the news of the greatest event of the millennium: "The soul, that last refuge of values, is dead, because educated people no longer believe it exists." Unless the assurances of the Wilsons and the Dennetts and the Dawkinses also start rippling out, the lurid carnival that will ensue may make the phrase "the total eclipse of all values" seem tame.

The two most fascinating riddles of the 21st century

If I were a college student today, I don't think I could resist going into neuroscience. Here we have the two most fascinating riddles of the twenty–first century: the riddle of the human mind and the riddle of what happens to the human mind when it comes to know itself absolutely. In any case, we live in an age in which it is impossible and pointless to avert your eyes from the truth.

Ironically, said Nietzsche, this unflinching eye for truth, this zest for skepticism, is the legacy of Christianity (for complicated reasons that needn't detain us here). Then he added one final and perhaps ultimate piece of irony in a fragmentary passage in a notebook shortly before he lost his mind (to the late–nineteenth–century's great venereal scourge, syphilis). He predicted that eventually modern science would turn its juggernaut of skepticism upon itself, question the validity of its own foundations, tear them apart, and self–destruct. I thought about that in the summer of 1994 when a group of mathematicians and computer scientists held a conference at the Santa Fe Institute on "Limits to Scientific Knowledge." The consensus was that since the human mind is, after all, an entirely physical apparatus, a form of computer, the product of a particular genetic history, it is finite in its capabilities. Being finite, hardwired, it will probably never have the power to comprehend human existence in any complete way. It would be as if a group of dogs were to call a conference to try to understand The Dog. They could try as hard as they wanted, but they wouldn't get very far. Dogs can communicate only about forty notions, all of them primitive, and they can't record anything. The project would be doomed from the start. The human brain is far superior to the dog's, but it is limited nonetheless. So any hope of human beings arriving at some final, complete, self–enclosed theory of human existence is doomed, too.

This, science's Ultimate Skepticism, has been spreading ever since then. Over the past two years even Darwinism, a sacred tenet among American scientists for the past seventy years, has been beset by...doubts. Scientists—not religiosi—notably the mathematician David Berlinski ("The Deniable Darwin," Commentary, June 1996) and the biochemist Michael Behe (Darwin's Black Box, 1996), have begun attacking Darwinism as a mere theory, not a scientific discovery, a theory woefully unsupported by fossil evidence and featuring, at the core of its logic, sheer mush. (Dennett and Dawkins, for whom Darwin is the Only Begotten, the Messiah, are already screaming. They're beside themselves, utterly apoplectic. Wilson, the giant, keeping his cool, has remained above the battle.) By 1990 the physicist Petr Beckmann of the University of Colorado had already begun going after Einstein. He greatly admired Einstein for his famous equation of matter and energy, E=mc², but called his theory of relativity mostly absurd and grotesquely untestable. Beckmann died in 1993. His Fool Killer's cudgel has been taken up by Howard Hayden of the University of Connecticut, who has many admirers among the upcoming generation of Ultimately Skeptical young physicists. The scorn the new breed heaps upon quantum mechanics ("has no real–world applications"..."depends entirely on fairies sprinkling goofball equations in your eyes"), Unified Field Theory ("Nobel worm bait"), and the Big Bang Theory ("creationism for nerds") has become withering. If only Nietzsche were alive! He would have relished every minute of it!

Recently I happened to be talking to a prominent California geologist, and she told me: "When I first went into geology, we all thought that in science you create a solid layer of findings, through experiment and careful investigation, and then you add a second layer, like a second layer of bricks, all very carefully, and so on. Occasionally some adventurous scientist stacks the bricks up in towers, and these towers turn out to be insubstantial and they get torn down, and you proceed again with the careful layers. But we now realize that the very first layers aren't even resting on solid ground. They are balanced on bubbles, on concepts that are full of air, and those bubbles are being burst today, one after the other."

I suddenly had a picture of the entire astonishing edifice collapsing and modern man plunging headlong back into the primordial ooze. He's floundering, sloshing about, gulping for air, frantically treading ooze, when he feels something huge and smooth swim beneath him and boost him up, like some almighty dolphin. He can't see it, but he's much impressed. He names it God.

Tom Wolfe has chronicled American popular culture for more than three decades. His best–selling books include The Electric Kool–Aid Acid Test, The Right Stuff, and The Bonfire of the Vanities.

This article can be found on the Tetrica website (link closed). Reprinted with permission.

Posted: 2003




Copyright © 2001-2018 OrthodoxyToday.org. All rights reserved. Any reproduction of this article is subject to the policy of the individual copyright holder. See OrthodoxyToday.org for details.


