
THE BODY IN QUESTION

Christine Rosen the flesh made word David Bosworth the new immortalists Mark Edmundson body and soul Rebecca Lemov the data-driven body Gordon Marino lessons from the ring

SUMMER 2015

WWW.HEDGEHOGREVIEW.COM

Summer 2015 / Volume Seventeen / Number Two

17.2

FROM THE EDITORS / 6

NOTES AND COMMENTS
A Disease Just Like Any Other / 8 Joseph E. Davis

Are We There Yet? / 10 B.D. McClay

Throwing Away the Key / 12 Lisa Lorish

THE BODY IN QUESTION / 15
The Flesh Made Word: Tattoos, Transgression, and the Modified Body / 16 Christine Rosen

The New Immortalists / 26 David Bosworth

Body and Soul / 38 Mark Edmundson

On Not Being There: The Data-Driven Body at Work and at Play / 44 Rebecca Lemov

Lessons from the Ring—Then and Now / 56 Gordon Marino

ESSAYS
The Witness of Literature: A Genealogical Sketch / 64 Alan Jacobs

On the Value of Not Knowing Everything / 78 James McWilliams

The Great Subversion: The Scandalous Origins of Human Rights / 90 Ronald Osborn

The Common Core and Democratic Education / 102 Johann N. Neem

BOOK REVIEWS
The Ransom of the Soul: Afterlife and Wealth in Early Western Christianity by Peter Brown / 112 Reviewed by Karl Shuve

Literary Criticism from Plato to Postmodernism: The Humanistic Alternative by James Seaton / 115 Reviewed by Steven Knepper

Higher Education in America by Derek Bok / 117 Reviewed by Chad Wellmon

The Other Solzhenitsyn: Telling the Truth about a Misunderstood Writer and Thinker by Daniel J. Mahoney / 119 Reviewed by James L. Nolan Jr.

The Paradox of Liberation: Secular Revolutions and Religious Counterrevolutions by Michael Walzer / 121 Reviewed by Jay Tolson

SIGNIFIERS
Narrative / 125 Wilfred M. McClay

The fox knows many things, but the hedgehog knows one big thing. —Archilochus

WWW.HEDGEHOGREVIEW.COM

Publisher: JOSEPH E. DAVIS
Editor: JAY TOLSON
Managing Editor: LEANN DAVIS ALSPAUGH
Associate Editor: B.D. MCCLAY
Circulation Manager: MONICA POWELL
Copy Editor: VINCENT ERCOLANO
Designer: BRANNER GRAPHIC DESIGN

TO FIND OUT MORE about The Hedgehog Review or the Institute for Advanced Studies in Culture, or to order a subscription (print $25/digital $10) or single issue (print $12/digital $5), please contact us: IASC, P.O. Box 400816, University of Virginia, Charlottesville, VA 22904-4816 Email: [email protected] / Web: www.hedgehogreview.com / Phone: (434) 243-8935 Cover: The Dancer, 1913, by Egon Schiele (1890–1918); Leopold Museum, Vienna; HIP/Art Resource, NY.

© 2015 Institute for Advanced Studies in Culture ISSN 1527-9677 (Print) / ISSN 2324-867X (Digital) All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means without permission in writing from the Editors. All statements of opinion or fact are the responsibility of the author alone and not of The Hedgehog Review. The Hedgehog Review is indexed or abstracted by the Humanities International Index, MLA International Bibliography, Sociological Abstracts, Worldwide Political Science Abstracts, International Political Science Abstracts, International Bibliography of Social Sciences, Gale, and EBSCO Publishing. The Hedgehog Review is a member of CELJ and is distributed by Ubiquity and Ingram.



FROM THE EDITORS

Mark Bauerlein, a professor of English at Emory University, recalls a startling pronouncement once made by a fellow academic: “Oh, it’s very important that everyone does something to the body!” What struck Bauerlein, even more than his colleague’s words, was how he meant them: “as a general moral injunction, a necessity.” That hortatory urgency, almost religious in its intensity, resonates with something deep and widespread in our culture, something powerfully anticipated in the famous closing line of Rainer Maria Rilke’s “Archaic Torso of Apollo”: “You must change your life.” As philosopher Peter Sloterdijk argues (in a book that takes the poet’s line as its title), Rilke was one of the early prophets of the self-transformative enterprise that has become the object of so many religion-like practices that flourish in the late modern world. Self-making, or self-remaking, is at once the great liberation, challenge, and burden of our age, bound up as it is with the larger project of identity (elective identity, in particular) and what we have come to call identity politics.

What does all this have to do with the body? Quite simply that new ways of understanding and treating the body—the focus of the thematic essays in this issue—are one strong expression of the relocation of the sacred that is bound up with our self-making projects. What we make of our bodies, in short, testifies to a great rupture.

For most of human history, the body enjoyed special, even hallowed, status because humans believed that it was a gift, bestowed upon them by a power greater than themselves. To those within theistic traditions, this power was divine; to those from nontheistic traditions, that power might be the inherent order of nature and the cosmos—the Tao, for example, of many Asian spiritual traditions. The body, by both understandings, was an endowment intended to support humans in their passage through this world, ideally for the attainment of a good life marked by virtue, heroism, or wisdom. It was also a forceful reminder of both the limits and commonality of the human.

Modernity’s slow onslaught on the traditional gods, as announced by Nietzsche and other thinkers, did not destroy the sacred itself. Instead, the sacred was relocated, dispersed, renamed. And new understandings of the sacred were supported by new dogmas, many of which were formulated and expounded in the academy. As Bauerlein says, summarizing the philosophy of his colleague:


The body is NOT a natural thing or divine form. It has no natural or supernatural status. That’s what my friend meant when he insisted on coloring hair, writing words on forearms, inserting studs in tongues, and otherwise modifying the physique. We must de-naturalize the body, redefine it as a human construct.… Yes, each one of us is stuck with the one we’ve got (at this point in time), but we can re-create it, fashioning it into an expression of the identity we prefer.

Champions of identity and identity politics who see this new understanding as an unquestionable good might reconsider. Even those who see the modern turn to identity as a necessary and laudable challenge to older conceptions of a universal human nature—conceptions that were often used to advance the interests of the privileged few of the right class, race, or gender—may sense how a project intended to liberate can easily be turned to opposite ends. Consider, for example, how the preoccupation with identity and difference has been exploited through the subtle (and not-so-subtle) processes of objectification, commodification, and commercialization.

There are even larger stakes. The reduction of the body to a kind of billboard or platform, to something infinitely malleable, inscribable, or plastic, comports all too neatly with the denial of a common humanity—and of those rights attendant upon that shared humanity—that is now in the ascendant around the world. Although his essay is not a part of our thematic treatment, Ronald Osborn’s reflections on the origins of our conception of universal human rights are an apposite reminder that it descends from the particular and “scandalous” proposition that all humans deserve equal treatment and consideration because all are created in the image of God, the Imago Dei. The body—with all its imperfections, individual variations, and limitations—tethers us to the sacred ground of our being. We deny that connection at our possible peril.


Notes and Comments

A DISEASE JUST LIKE ANY OTHER

The suicide last summer of the actor and comedian Robin Williams sparked the latest round in our ongoing public discussion of mental illness. Predictably, once speculation about details of his life subsided, the concern shifted to “stigma” (prejudice and discrimination) and how mental illness is misunderstood and misrepresented. Unlike the reality of physical illness, the now-familiar argument runs, the reality of mental illness is denied, shrouded instead in benighted moral judgments. “It is NOT cowardly to suffer or seek help,” tweeted Williams’s daughter on World Mental Health Day in October, giving fresh voice to this widely shared position.

As early as 1961, the Joint Commission on Mental Illness and Health—“the last word of the mental health establishment”—in its final report, Action for Mental Health, lamented how the public had turned “deaf ears” to the psychiatric “cardinal tenet” that the “mentally ill…are sick in the same sense as the physically sick,” and therefore should be understood and treated just as nonjudgmentally as those with somatic afflictions. This tenet, the commission made clear, was not only a scientific truth, but also critical to overcoming negative attitudes and fostering “humane, healing care.” That rationale has since informed countless public campaigns by government commissions, professional associations, and advocacy organizations to promote mental health “literacy” and improved treatment. Adding a new twist in recent decades, such campaigns now place a central emphasis on neuroscience, both to promote the view that mental health problems arise from neurobiological aberrations and to undermine resistance to the use of neurobiological (i.e., psychopharmaceutical) solutions. Pharmaceutical companies, which actually fund many of these campaigns, advance the same viewpoint.

The strategy of promoting disease theories to the public arose long before the “decade of the brain” (1990–2000), and does not now reflect any actual breakthrough in our understanding of brain abnormalities. While there is constant talk of “revolutionized thinking” and “dramatic improvements in understanding the biology of mental illness,” the reality is far more pedestrian. Indeed, if anything, new findings in genetics and neuroscience are doing more to undermine current ideas and diagnostic categories than to provide new models or biological markers, and have so far yielded no treatment discoveries. The effective medications remain those that were discovered serendipitously long ago, the last jackpot being the class of antidepressants referred to as selective serotonin reuptake inhibitors (Prozac, for example), introduced in the 1980s. Even now, we don’t understand the relationship between the action of psychiatric medications (e.g., increasing serotonin levels) and the causes of disorder. This is not to downplay the scientific research or its potential. It is simply to observe that clinical payoffs remain aspirational.

The brain-disease promotion strategy rests not on scientific advances but on an assumption—one might even say a wager—about personal responsibility. The logic behind this wager has three steps. First, the root cause of the stigma that attaches to mental health problems lies in misguided attributions of responsibility by sufferers and non-sufferers alike. Both mistakenly believe that problems are caused by moral failings or weak willpower. The effort to combat stigma must begin by undercutting this blame. Second, treating a mental health problem as a somatic disorder or disease presupposes the presence of some underlying and dysfunctional mechanism in the individual that reduces volitional control. (Talk of disease thus always implies a removal of personal responsibility. The addition of overstated claims from neuroscience, which suggest that underlying somatic mechanisms have in fact been identified, fortifies and justifies the disease model.) Third, once the burden of responsibility is lifted, sufferers will be open to seeking help and accepting and staying in treatment. This, in turn, will have the consequence of improving how other people react to them.

This logic has an intuitive and, paradoxically, moral appeal. Promotion of the “disease just like any other” view is widely regarded, in the words of one study, to be a measure of a “liberal, knowledgeable, benevolent, supportive orientation toward the mentally ill.” No wonder anti-stigma promoters have so doggedly conducted campaigns in its terms.

But there is a catch. Studies of the relationship between belief in brain disorders and stigmatizing attitudes do not find greater tolerance, either among psychiatric patients or the general public. If anything, they show just the opposite. Over the past two decades, both the general public and patients themselves—historically quite resistant to disease models of emotional and psychological problems—have become relatively more biologically minded. This change correlates with an increased endorsement of seeking professional help and using prescription medicine, as well as some reduction in attributions of blame: three of the central goals of the public anti-stigma campaigns (and pharmaceutical advertising). The change, however, also correlates with more intensely negative attitudes toward those with mental health problems, attitudes that are shared even by sufferers themselves. Promoting the brain disease model has had the intended consequence of contributing to the medicalization of distress and boosting the sale of psychopharmaceuticals. If that wasn’t troubling enough, the campaign has also had the unintended consequence of promoting the very stigma and discrediting attributes it sought to reduce.

What went wrong? Is this just more evidence of ignorant laypeople drawing the wrong conclusions after finally catching on to the brain disease idea? Or does the problem lie with the wager on responsibility itself? My interviews with people over the years suggest that trading personal agency for a kind of medical absolution exacts a high price. After all, what level of brain abnormality is implied by the claim that diagnosed persons are not responsible for their actions? Surely, it must be a high level—one at which behavior is effectively determined by forces, disease processes, beyond a person’s intentional control. And this would be a level we normally associate with only the most severe psychoses. At stake in the mental domain, but far less in the strictly physical one, is our very notion of personhood. Should it be a surprise that accepting a disease model that effectively strips people of their agency and freedom leads to prognostic pessimism for sufferers and avoidance of sufferers by others? The model, as my interviews suggest, implies a categorical difference, a situating of sufferers outside the community of viable selves in which persons are responsible for their actions. Contrary to the theory, removing responsibility results in a less favorable and more patronizing view of sufferers—and as a result, more stigma, and more isolation.

—Joseph E. Davis

ARE WE THERE YET?

Books and articles diagnosing America’s spiritual afflictions often ask the same questions: Can women have it all? Where have all the men gone? Are we bad parents? Why are college students so dumb? And where, oh where, are all the adults? That last question has recently acquired particular urgency, perhaps because it now seems to subsume many of the others. Anxiety over adulthood has even become a miniature growth industry, feeding off the concerns of the young as well as their elders. Although each generation tends to doubt that the ones following it will be capable of launching themselves, something different appears to be happening now: It’s the young themselves who seem most to doubt their ability to assume adulthood.

Among my fellow late-ish twenty-somethings, the word adult has become a verb. “Who let me adult?” an acquaintance asked recently after purchasing a home. When adult becomes a verb, adulthood itself becomes something new: more an act to be performed than a state to be once and fully attained. For young people suspended between dependency and independence, the material touchstones of adulthood—a house, a car, a benefits-bearing job—seem far beyond their reach. To assist the would-be performer of adulthood, one organization, the Society of Grownups, employs “professional grownups” who “help the next generation embrace their inner adult.” It offers classes in building stock portfolios, negotiating salaries, cooking adult meals, and planning “grownup trips.” (The last class is titled “Beyond the Hostel,” implying that a grownup trip is necessarily a costly one.)

For those who would rather help themselves, there is Kelly Williams Brown’s best-selling self-help book Adulting: How to Become a Grown-up in 468 Easy(ish) Steps. The cover shows Brown, wearing a retro-styled blue cocktail dress, sitting in a clean room on a tan leather couch with a pizza box at her feet. Adulthood here is not only a verb but a lifestyle, one that requires spending a certain amount of money on one’s furniture and clothes. An adult is someone who has attained enough maturity and wherewithal to purchase a tan leather couch and keep it clean. The pizza box functions as a wink: Even Brown isn’t an adult all the time.

In 1962, the young woman striking out on her own might have purchased a very different book (by a different Brown). Sex and the Single Girl, by Cosmopolitan editor Helen Gurley Brown, was not the first self-help guide aimed at young professional women. Vogue editor Marjorie Hillis had helped break that ground in 1936 with Live Alone and Like It. But the title of Brown’s book was significant because it spoke to girls, not to women—more specifically, to women who aspired to remain girls. “The single woman…is emerging as the newest glamour girl of our times,” Brown wrote. “She is free to be The Girl in a man’s life, whether he is married or single himself.”


Sex and the Single Girl was a book for the young woman uninterested in shouldering the old responsibilities of marriage or the new political responsibilities of feminism; it supported a quasi independence that was still defined around the needs of men. Brown’s single women were smart (but not too smart), stylish, and thin. She offered them “a study on…how to stay single—in superlative style.” Married life was cast, by contrast, as dull and full of thankless obligations (and, of course, the promise of eventual faithlessness on the part of the husband, who was sure to become bored).

Was Brown’s advice bad? Undoubtedly—but it resonated with other critiques of the time, including Sloan Wilson’s The Man in the Gray Flannel Suit (1955), Rona Jaffe’s The Best of Everything (1958), Richard Yates’s Revolutionary Road (1961), and Mary McCarthy’s The Group (1963). All of these books expressed a deep skepticism over whether the security promised by American adulthood was worth the price. “The worst fate,” McCarthy wrote of her Vassar graduate heroines in The Group, “they utterly agreed, would be to become like Mother and Dad, stuffy and frightened. Not one of them, if she could help it, was going to marry a broker or a banker or a cold-fish corporation lawyer, like so many of Mother’s generation.”

As early as 1949, the journalist William H. Whyte produced his first article exposing the docile products of the American education system as potentially “so tractable and harmonious as to be incapable…of making provocative decisions.” American middle-class adulthood, as dissected by social critics and novelists of the fifties and sixties, was simply unfit for adults. The roles it provided were too thin and flimsy for people with real ideals or aspirations. But in rejecting those roles, many of the young rebels of subsequent decades experienced different kinds of frustration and struggled with achieving adulthood themselves.

Perhaps not surprisingly, those staid and secure roles now look more appealing to many of today’s young adults. The historian Steven Mintz, in his recent book The Prime of Life: A History of Modern Adulthood, tracks the shift from adulthood “defined by…marriage, a full-time job, home ownership, and childrearing” to adulthood “as an outlook and self-image, defined by financial independence, a distinctive state of mind, and particular symbols of psychological maturity.” Mintz characterizes the American attitude toward adulthood as ambivalence. But it’s also anxiety about being unable to escape adolescence. The twenty-somethings of the 1960s share with the twenty-somethings of the 2010s an anxiety over their choices and their roles. Today, there are fewer formal limits placed on choices, but the choices seem impossible to make. People begin to think nostalgically of roles. “If only nobody was telling us what to do” becomes “if only we were told what to do and how to decide.” But neither of these is really the problem. Even within the best possible range of choices, choosing one thing means not choosing another. No matter how much we should try to structure society so that most people have the best possible choices available to them, there will still be people who are unhappy, regretful, or disappointed.

If Mintz is correct in concluding that we are shifting away from an adulthood defined by institutional and material attainments, our new adulthood will be one that is both easier and harder to attain. If adulthood comes to be defined not by success or financial stability but by an acceptance of limits, then the only person who will truly know whether or not he or she is an adult will be—you, yourself. Such certainty is hard to come by, and that difficulty makes adulthood almost synonymous with anxiety over achieving adulthood. To those graduating from college and establishing themselves, take heart: You already adult.

—B.D. McClay



THROWING AWAY THE KEY

When he was eighteen, Rene Lima-Marin and a friend robbed two Colorado video stores of a total of about $11,500, threatening the employees of both establishments with a gun. Each man was charged with two counts of first-degree burglary and three counts of aggravated robbery. Under pressure from years of increasing gang violence in Denver, Colorado’s Eighteenth Judicial District had answered the growing public outcry with tough new sentencing protocols. Lima-Marin found himself labeled a chronic offender. Offered a sentence of seventy-five years if he would plead guilty, he decided to risk going to trial, hoping that a lenient judge would find some or all of the evidence inadmissible. That didn’t happen, and Lima-Marin ended up paying what the National Association of Criminal Defense Lawyers calls a “trial penalty.” Convicted, Lima-Marin was sentenced to ninety-eight years.

What makes the case remarkable is what happened next. In what his attorney legitimately believed was the result of an appeal (but was in fact the product of clerical error), Lima-Marin came up for parole after serving only a decade of what was in effect a life sentence. Released, he immediately moved in with his former girlfriend and became her son’s stepfather. He found and held jobs. The couple married, became regular churchgoers, bought a home, and had a son together. Lima-Marin mentored at-risk youth and coached his stepson’s soccer team. He committed no new crimes and successfully completed his five years on parole. Then, in 2014, five years and eight months after he was released, Lima-Marin received a call notifying him that his release had been a mistake and that a judge had signed the order for his arrest. He was picked up the very same day and, after a quick hearing, was taken back to prison, where he faced at least seventy-five more years before possible parole. He was in his thirties at the time.

All students of criminal law learn that there are five different justifications for the punishment of those who commit crimes: retribution, deterrence, rehabilitation, restoration, and incapacitation. In federal criminal practice, these rationales are explicitly spelled out by statute. Yet with all of these considerations supposedly in play, the vast majority of criminal sentences in the United States are handled with two tools, sometimes combined: fines and incarceration (often followed by a period of supervised release or probation). The criminal justice system’s bluntest tool—incarceration—takes account of retribution (punishing a societal wrong), incapacitation (keeping someone dangerous off the streets), and deterrence (providing a disincentive to others to commit similar kinds of criminal acts). Financial penalties reflect the need for restoration (making a wrong right), at least in cases of fraud and theft, though for the greatest number of offenses that result in fines (driving offenses or other “crimes against society”), the imposition of a monetary payment appears to be more about retribution and deterrence than anything else. Moreover, when fines with quickly accruing interest go unpaid, incarceration often results.

But while fines and incarceration satisfy most justifications for punishment, it would be hard to argue that they do anything toward rehabilitation. Indeed, with many jails and prisons now offering prisoners little or no access to educational opportunities, vocational training, or mental health or addiction treatment, few would say that incarceration is serving any rehabilitative purpose. Although not alone in doing so, philosopher Jonathan Jacobs makes a persuasive case that incarceration, far from being rehabilitative, usually has a corrosive effect on the character of prisoners. He points to several contributing factors: a lack of autonomy; the impression, at least, that disciplinary regulations and sanctions are arbitrary; the constant threat of violence; the lack of meaningful social interactions; inadequate mental health care. Additional postincarceration hurdles faced by a convicted felon (lack of access to government benefits and loans, severely limited employment possibilities) only increase the likelihood that he or she will be driven back to crime.

The most recent US Bureau of Justice Statistics study on recidivism rates for state prisoners released from incarceration (published in 2005) produced staggering findings. Two-thirds of the state prisoners tracked in the study were re-arrested (not necessarily re-convicted) within three years of their release. That figure jumped to three-quarters within five years of release.

The case of Lima-Marin should make us stop and ask why we punish, and what happens to those we punish. No longer used by the federal government and many states, the parole system formerly provided for indeterminate sentences, with the possibility of earlier release depending on a defendant’s behavior and demonstrated rehabilitation. The movement to abolish parole in the 1990s coincided with the push to legislate mandatory minimum sentences, the aim of both being the elimination of discretion, and therefore discrepancies, among criminal sentences.

But uniform sentencing is both a blessing (arguably counteracting racial and other biases) and a curse (removing the ability of a judge or parole board to assess individual offenders). The elimination of parole also removes the political risk that a recently paroled offender will commit a crime that will spark public outrage directed against not only the perpetrator but also elected officials and the judiciary. Ultimately, though, the demise of parole sends a deeply demoralizing message to the incarcerated: We don’t care what you do to try to rebuild your life while you are in prison. In much-publicized contrast with the American system, Norway caps criminal sentences at twenty-one years, extending them in five-year increments only if it is determined that an offender is not rehabilitated by the end of his or her initial term.

By rebuilding his life without committing another crime within five years after his release, Lima-Marin beat the odds, and he did so after serving only about a tenth of his sentence. In addition to making us think about the sheer length of the sentence he received at age eighteen, might his story suggest that rehabilitation needs a more prominent place in our thinking about the means and ends of punishment?

—Lisa Lorish




THE BODY IN QUESTION

Our bodies, ourselves? In one sense, of course. But the things we now do to our bodies, whether through tattooing, piercing, or sculpting, and the ways we attempt to perfect or transcend them, whether through extreme fitness regimes, self-tracking, or artificial enhancements, suggest new, if not fully articulated, conceptions of the human person and the ends and purposes of human existence.

These conceptions have a history, of course. They derive in part from a centuries-old confidence in the power of science to fix, extend, and possibly even “immortalize” our physical selves. They resonate with the American dream of self-remaking and the New Adam. And they recast the Protestant concern with the born-again experience in secular and material terms (see Bosworth’s essay). But these ideas have been transformed and popularized through association with assorted projects reflecting our highly individualistic and commodified culture, from identity politics and transhumanism to the Quantified Self movement to assorted cults of body modification. “Today,” writes Christine Rosen in her essay, “devotees of body modifications are a thriving subculture with their own social networks, e-zines, websites, conventions, and celebrities.”

However different in particulars, all such projects share a view of the body as a malleable object, subject only to personal whim or desire. That view prevails because we now have bodies seemingly devoid of souls, and therefore bodies quite different from what humans long thought them to be: as part of what Ralph Waldo Emerson called a “stupendous antagonism” (see Edmundson’s essay). It may be the supreme irony that the death of the soul presages the demise of the body.

The merger of humans with smart machines—the “Singularity” sought after by assorted futurists and high-tech visionaries—is already underway. It is visible in the extensive human-machine interactions of people at work and play. But as historian of science Rebecca Lemov notes in her essay, the visible is often ignored: “Despite the fact that the vacant or overly tracked body is increasingly the condition of certain kinds of repetitive and exploitative work, the body remains in the background of our awareness.” And despite the various attentions we now lavish on the body, the body itself may be losing its true magisterium (see Marino’s essay). No longer a source of wisdom about human limits and potential, it is now seen as a means of self-transformation, an instrument in the pursuit of perfection—or an equally elusive immortality.


The Flesh Made Word
Tattoos, Transgression, and the Modified Body
Christine Rosen

Christine Rosen is senior editor of The New Atlantis: A Journal of Technology and Society and a Future Tense Fellow at the New America Foundation. She is the author of Preaching Eugenics: Religious Leaders and the American Eugenics Movement and The Extinction of Experience (forthcoming). Right: “My Tatts Are Personal” by Elizabeth Waugh; Photolibrary/Getty Images.

In 1882, the Duke of York, who later became King George V of England, traveled to Japan with his brother, the Duke of Clarence and Avondale. Prince George had an audience with Emperor Meiji and, according to historian Donald Keene, presented Empress Haruko with two wallabies from Australia. He also visited a tattoo artist, who inscribed a dragon on the arm of the future king (as well as one on his brother).1 They were not the first royals to have themselves tattooed—twenty years earlier, their father, King Edward VII, had had a Jerusalem cross tattooed on his arm during a visit to the Holy Land. And in 1066, King Harold II’s tattoos were used to identify his body after he died at the Battle of Hastings.

Human beings have always marked themselves. Ötzi the Iceman, a Bronze Age man whose 5,000-year-old remains were found in the Alps on the Austria-Italy border, had several tattoos, including a small cross behind his left knee. Using a computed tomography scan, researchers at the British Museum recently discovered that a female Egyptian mummy dating from 700 CE had the name “Michael” tattooed on her thigh.

Tattoos and other body modifications have long been a way to mark one’s membership in a group. Members of indigenous tribes, practitioners of certain religions, sailors, prisoners, and gang members have all used permanent body marking as a way to signal belonging. Today, devotees of body modification are a thriving subculture with their own social networks, e-zines, websites, conventions, and celebrities. They embrace not only tattooing but also practices such as scarification (deliberate scarring of the skin), subdermal implants such as bumps and spikes on the forehead, various body piercings and dental modifications, and stretching of the lips, earlobes, and nostrils, among other body parts. The heavily tattooed men and women who used to be displayed as “freaks” at carnival sideshows would barely get a second glance at a contemporary body modification convention such as ModCon.

In an era of excessive individualism, our markings and modifications are viewed not as a sign of freakishness or outlier behavior but as an expression of personal taste, devoid of historical or cultural baggage. I doubt that the waitress at my favorite pizza place, who has a delicate butterfly tattoo winding up her wrist, thought much about the fact that her ink gives her a shared history, stretching back centuries, with both British royalty and prison gang members. “I just thought it was beautiful,” she told me, when I asked her why she got tattooed. And it is.

But as body modification becomes more individually expressive and less an expression of affiliation, its cultural meaning becomes clouded. What does the weakening of stigma associated with some body modifications suggest about cultural change? What does our embrace of the extremes of body modification reveal about our understanding of the integrity of the human body? What do these extremes have to teach us about more accepted cultural practices such as dieting and cosmetic surgery? And what motivates us to do these things to ourselves?

A Personal Mantra

Tattoos are a useful case study because they have moved from stigma to acceptance and back again many times in history. Once the province of criminals and other stigmatized groups, in the past thirty years tattoos have become a mainstream feature of American culture. You can find tattoos on people of nearly every class and race. Today, “It was spring break” is just as likely to be the answer to the question of why someone got tattooed as “I was in prison” was for previous generations. People now embrace tattoos to honor someone who died, to commemorate an important life event, or to have a permanent reminder of a personal mantra. Julia Gnuse, an American woman with more than 95 percent of her body tattooed, originally began covering herself with ink to mask the ravages of porphyria, a condition that leads to blistering and scarring of the skin.

A 2012 Harris Interactive poll found that one in five American adults has a tattoo, with people thirty to thirty-nine more likely to be tattooed than members of any other age group. Evidently, most people who get “inked” don’t regret their tattoos; 86 percent of the respondents said they never had. The association of tattoos with deviance or criminality has apparently faded; 75 percent of the people surveyed said that having a tattoo didn’t make a difference in what they thought about someone’s likely behavior.2


Demographically, there are few differences among those who do and don’t get tattooed: Slightly more Hispanic people than white or black people have tattoos, and slightly more women than men. Politically, tattoos have nearly bipartisan appeal—17 percent of Republicans versus 22 percent of Democrats and 21 percent of independents have them. The military has loosened restrictions on tattoos for enlisted personnel, and popular musicians and celebrities frequently display their tattoos in movies and photos. More extreme forms of body modification are even making appearances in the realm of high fashion: The fall 2015 Givenchy runway show featured models with (fake) pierced septums and glued-on stones meant to resemble cheek piercings, and a photographer, Christian Saint, recently published a book titled Tattoo Super Models. “I think people are realizing that tattoos are not that different from fashion itself,” said Saint in an interview with the British newspaper The Daily Mail. “Their artwork is as much of who they are as the clothes they wear.”3

Extreme modification practices are also seeping into the mainstream. The checkout clerk at my grocery store has two black stretchers in his earlobes. When I asked him about them, he shrugged and said he “just wanted to try it out.” The holes are about half an inch in diameter now and no more noticeable than a small pair of hoop earrings. In a culture that has embraced cheek, lip, pectoral, and many other kinds of implants and injections routinely sought by cosmetic surgery patients, it seems unfair to label a man with stretched earlobes as any more or less freakish than a woman with distended, cosmetically enhanced lips. The line between cosmetic and therapeutic practices appears no longer to exist in contemporary culture; judging by the amount of money people spend on it every year (more than $12 billion in the United States in 2013), cosmetic surgery is therapy for many people.4

As we begin to view body modification as an expression of individual aesthetic preference and less as a marker of deviant behavior, cultural theorists have a harder time making sense of it. The field of cultural studies often looks to the margins to better understand the center, and often takes a celebratory rather than critical approach to those it deems “transgressive.” This is problematic with regard to body modification. Only 25 percent of the people with tattoos surveyed by Harris Interactive said getting one made them feel more rebellious. (Far more claimed it made them feel “sexy.”) Once a practice is mainstreamed and commodified and stripped of its subversive associations, can it really be called transgressive?

Defining Deviance

The difficulty of defining deviance is one of the problems that bedevil Beverly Yuen Thompson’s recent book, Covered in Ink, her “ethnography” of what she calls a “deviant group”: heavily tattooed women. The author, herself one of these women, begins with the obligatory discussion of her own victim status: “As a mixed-race Chinese/White and petite woman, I face stereotyping.”5 Thompson is highly sensitive about what others think about her tattoos. Throughout the book, she complains that strangers ask “silly and uninformed questions” about them, such as “Did that hurt?” or “What does that mean?” She is offended when people compliment her by saying “Nice tats,” because she feels that it sounds too much like “Nice tits.” “I felt that my tattoos were beautiful and reflective of my inner self, yet I feared misunderstanding from the general public,” she writes. Even so, most of the other heavily tattooed women she interviews describe their experience in public space as generally positive, and are less bothered by the unwelcome critical remarks they sometimes receive.6

In spite of this diversity of experience, Thompson feels she must politicize the meaning of the “socially sanctioning glares” she claims to receive on a regular basis. Citing sociologist Erving Goffman’s pioneering work on social interaction in public space, she argues that tattooed women face especially hurtful stigma and opprobrium in public because they are acting in a way that transgresses social boundaries.7 French sociologist Pierre Bourdieu claimed that evaluating the deficiencies of another’s appearance is one of the ways the “petit bourgeois” exercise their power over others whom they deem “vulgar.” “These refusals, almost always expressed in the mode of distaste, are often accompanied by pitying or indignant remarks about the corresponding tastes. (‘I can’t understand how anyone can like that!’),” he wrote.8

But in a society where both the powerful and the powerless embrace modifications such as tattoos, refusal goes both ways. Once, when I was with my young sons, we encountered someone who had visible, graphic tattoos of scantily clad women on his arms. My kids naturally asked me about the tattoos (within earshot of the man who was tattooed). I explained to them that he had the right to put whatever he wanted on himself, just as they had a right to have an opinion about it. Although I would rather not have had my five-year-old boys see highly sexualized images of naked ladies on a man’s arm, I don’t think seeing those tattoos harmed them. But what if his tattoos had included racist statements? Would I have been within the bounds of civil behavior to say something about his uncivil display?

In his work, Goffman describes the rules of civil interaction in public space as akin to a delicate dance—a give-and-take that requires the active and thoughtful participation of both people in a social encounter if it is to go smoothly. Thompson, by contrast, wants others, despite their understandable curiosity, to suppress their reactions to and interpretations of her highly visible tattoos; there is no give-and-take, only her autonomy. She even suggests that the non-tattooed can’t understand the culture of the tattooed, and condescendingly describes the efforts of a young woman who wanted to make a documentary about tattooed women by noting her failure to find participants: Because the would-be filmmaker was “non-tattooed, her knowledge about the culture was lacking, and this came through in her approach.”9

Spectacle, Performance, and Power

What emerges from Thompson’s approach is a need to impose her own categories on others’ expressions of themselves. A woman with one tattoo isn’t transgressive, Thompson argues, because it is now socially acceptable to have “small, cute, and hidden tattoos.”10 That tiny dolphin that you had inked on your ankle in college isn’t transgressive enough—you are still laboring under false consciousness and trying to live up to misguided standards of female beauty. If your leg is tattooed with zombies or skulls or snakes, however, you’ve struck a blow against the patriarchy. But what if you live in a conservative religious community where tattooing is forbidden? One tiny, hidden, “feminine” tattoo might represent a far greater challenge to authority than a sleeve of tats would on an atheist barista in Brooklyn.

There is an inherent contradiction in the cultural studies work of scholars such as Thompson. She wants tattoos (her own and those of others) to mean something, something that aligns with her view of power and social relationships in contemporary culture. She wants them to show that women are using their power to upend social expectations by embracing a previously masculine practice (heavy tattooing) and claiming it as their own. But she also wants people to pretend not to notice the results of their upending of expectations. The final chapter of her book includes a guide to “tattoo etiquette” in which she advises non-tattooed people (that is, the majority of the population) how to behave.

You could glean far better tips on etiquette and a far more complex view of the cultural tropes of body modification by watching one of the many reality television shows that feature tattoo artists. Series such as Miami Ink and Black Ink Crew offer intimate glimpses of the business of tattooing. Some of the shows follow the model of cosmetic-surgery reality programming, featuring botched procedures and sensationalistic personality clashes meant to fuel ratings. But others offer insight into the motivations of people who want tattoos—and there are as many motivations as there are styles of body modification. Trying to impose theory on such a diversity of practices, as practitioners of cultural studies are wont to do, often sheds more heat than light.

And what about the more extreme modification practices? Experimental performance artist Ron Athey regularly inserts large needles and metal hooks into his skin, as well as performing scarification and branding in front of live audiences. He calls this self-harm a form of art, a commentary on being an HIV-positive gay man who grew up in a restrictive Pentecostal household. “My work always has a philosophical question, a thesis,” Athey told a reporter for Vice. One of his facial tattoos is a teardrop below his eye.11


Other extreme body modifiers have become celebrities and performance artists who make their living displaying their modifications: Maria Jose Cristerna, a Mexican mother of four and former lawyer who now works as a disc jockey, calls herself “Vampire Lady”; she has had her teeth filed down to fangs, has been tattooed on nearly every inch of her skin, and had titanium horns implanted in her head. Eric Sprague, a performance artist from Texas who calls himself “Lizardman,” has piercings, tattoos, and implants along his forehead; he even had his tongue surgically bifurcated to resemble a lizard’s tongue.

To many people, the more extreme forms of body modification suggest a kind of debasement of the human form, a rejection of the body rather than a celebration of it. Scholars such as Sheila Jeffreys have criticized body modifiers’ invocation of autonomy and postmodern feminism to justify their practices. She calls body modification a form of “mutilation” and argues that it is “a result of, rather than resistance to, the occupation of a despised social status under male dominance.”12 She views body modifiers as more like the young women who engage in self-harm practices such as cutting or those who suffer from eating disorders. On Tumblr and other social media platforms, you can find countless teenage boys and girls hosting “nonjudgmental” pages documenting various forms of self-harm such as cutting and eating disorders. What is it that makes one form of self-inflicted violence “art” and another an expression of depression or other mental illness?

Many of the techniques celebrated by cultural theorists as transgressive (ear stretching, certain forms of scarification) are appropriated from indigenous people who themselves have often been considered victims of oppression by cultural theorists. Fakir Musafar (born Roland Loomis), a self-described “Master Piercer and shaman,” has been experimenting on himself for decades and is considered to be the founder of a “modern primitivism” that embraces a range of body modification practices. Now in his eighties, he offers “Fakir Intensives”—workshops on the “art, skills, and magic of body piercing and branding”—and by branding he does not mean the techniques of marketers and advertisers.13

In her book In the Flesh: The Cultural Politics of Body Modification, scholar Victoria Pitts describes a young man active in body modification: “His attitude toward the body is postmodern and cyberpunk—he mixes tribal and high-tech practices to create a hybrid style and sees the body as a limitless frontier for exploration and technological innovation.” Pitts believes that his actions “create not only spectacle and controversy but also new forms of social rebellion through the body.”14 But Pitts and other theorists must also acknowledge that many of their “transgressive” subjects are white, middle-class Western men and women who are not acting so much like “modern primitives” intent on “postcolonial discourse” as they are like the customers at the Build-a-Bear stores that dot American malls: adding and subtracting modifications to create an expression of something lovably personal, an ideal expression of their individual aesthetic preferences, not a commentary on power and social norms.

Tattooed woman, Shanghai, China, 2007; PYMCA/UIG/Bridgeman Images.

The Meaning of the Body

Cultures get the theory they deserve. The logical conclusion of our excessively individualistic and commodified culture is memoir masquerading as theory, and personal experience standing in for social empowerment. Take the work of Lianne McTavish, a professor of art at the University of Alberta, who engaged in “embodied research” by entering the Northern Alberta Bodybuilding Championships. Claiming that she was not subject to the male gaze but appropriating it, she published a scholarly book, Feminist Figure Girl: Look Hot While You Fight the Patriarchy, in which she describes her pursuit of a “visibly muscular X shape, with wide shoulders and lats that taper into a narrow waist then flare out again with chiseled glutes and hams.” She argues that bodybuilding isn’t merely a stereotypically masculine domain; she found acceptance of a wide range of physical appearances and an “open and flexible practice” not unlike yoga. But unlike yoga, her punishing, months-long training and diet regimen ended not with her achieving inner peace but competing against other women on a stage while slathered with tanning dye and decked out in a “tiny blue velvet bikini” and plastic high heels, a vision of female empowerment likely unimaginable to, say, the nineteenth-century suffragists.15

Then again, the notion of empowerment, like the descriptor transgressive, can be used to describe so many things that it has lost much of its rhetorical force. Is every porn star who buys enormous breast implants empowering herself by enhancing her body’s market value, or becoming a victim of patriarchy by conforming to its demands? Some cultural theorists argue that plastic surgery can never be transgressive because the women having it are conforming to beauty stereotypes (often while keeping their conformity “hidden” by lying about their surgical alterations); plastic surgery junkies pursue a “normative body project,” as one cultural theorist has argued. Body modification advocates, by contrast, make a deliberately public statement about their bodies that challenges norms. “Pierced, scarred, and tattooed…the body is a site of symbolic resistance, a source of personal empowerment, and the basis for the creation of a sense of self-identity,” declares Daniel Wojcik, an English professor who has written about body modification.16

But what is normal? With the virtual increasingly replacing the real, perhaps the real has to become more extreme to feel genuine. As we perform and live more of our lives online, our memories and experiences and identities take on an increasingly ephemeral and homogenized quality; a metal spike through the ear, by contrast, is a lastingly corporeal statement. And as body modification itself becomes normalized, the need to attach theories to people’s motivations disappears.

Both of my sisters are tattooed—my older sister has one small tattoo (which cultural theorists such as Thompson would define as typically feminine and thus not genuinely transgressive), and my younger sister is extensively tattooed. Neither regrets getting inked. Nor do my sisters entertain complicated theories about what their tattoos might mean to anyone else. They got them because they wanted to. Their tattoos are an expression of their sense of self, like the clothing they wear or the hobbies they pursue.

Cultural theorists would say that that’s only the beginning of the story, not the end. They would be right in one sense: Body modification is always a cultural signifier because the body is the site of our views about what a person is (or should be) and how that person should (or should not) behave. There are reasons why we modify our own bodies, and there are reasons why social groups pressure their members to look a certain way. Societies generally want conformity (and stability) from their members, but individuals within social groups often seek to highlight their difference and individuality. If practitioners of body modification want the freedom to see their bodies as expressions of their individuality, then they must also accept that others might freely express their disapproval. In a world where we’re encouraged to rank, review, promote, “like,” or retweet every meal we eat and item we purchase, why should someone else’s aesthetic choices, however quixotic, be free from our relentless, instantaneous rush to judgment?

Our current approach to body modification—haphazard, arbitrary, and driven almost entirely by individual preference—is not without its own risks. By embracing modification as personal preference, we avoid wrestling with some important questions: What is the meaning of the body? Is it something sacred, a temporary gift that we have a responsibility to use well? Is it a bequest from God, or nature, for which we bear responsibility? Today we treat our bodies like material possessions over which we have exclusive ownership and, we incorrectly assume, total control. But questions about the human body will only become more important in the near future, when we will have access to a range of new technological and genetic enhancements that will force us to confront what it even means to be human. The conversations we should be having aren’t about deviance and power and the fetishization of difference; they are about the integrity of the human body. After all, our physical bodies are the means by which we understand ourselves and the world, and the greatest proof of our shared history as human beings. They are what we have in common with each other, no matter how much we attempt to change. That understanding, like the tattooists’ skill with needle and ink, is something we must cultivate if we don’t want it to fade.

Endnotes
1. Donald Keene, Emperor of Japan: Meiji and His World, 1852−1912 (New York: Columbia University Press, 2002), 811.
2. Harris Interactive, “One in Five U.S. Adults Now Has a Tattoo,” February 23, 2012; http://www.harrisinteractive.com/NewsRoom/HarrisPolls/tabid/447/mid/1508/articleId/970/ctl/ReadCustom%20Default/Default.aspx.
3. Christian Saint, Tattoo Super Models (New York: Goliath Books, 2015); Maybelle Morgan and Toni Jones, “Think Tattoos Are for Thugs? Think Again,” Daily Mail Online, March 13, 2015; http://www.dailymail.co.uk/femail/article-2990264/The-stunning-models-covered-intricate-tattoos-vying-makeinkings-big-thing-catwalk.html.
4. “Statistics, Surveys, and Trends,” American Society for Aesthetic Plastic Surgery, March 20, 2014; http://www.surgery.org/media/news-releases/the-american-society-for-aesthetic-plastic-surgery-reports-americans-spent-largest-amount-on-cosmetic-surger.
5. Beverly Yuen Thompson, Covered in Ink: Tattoos, Women, and the Politics of the Body (New York: New York University Press, 2015), 5.
6. Ibid., 3−4.
7. Ibid., 4, 19.
8. Pierre Bourdieu, Distinction: A Social Critique of the Judgment of Taste (Cambridge, MA: Harvard University Press, 1984), 58, 61.
9. Thompson, Covered in Ink, 12.
10. Ibid., 10.
11. Amelia Abraham, “Ron Athey Bleeds for His Art,” Vice, September 24, 2014; http://www.vice.com/en_uk/read/ron-athey-performance-art-amelia-abraham-121.
12. Sheila Jeffreys, “‘Body Art’ and Social Status: Cutting, Tattooing, and Piercing from a Feminist Perspective,” Feminism and Psychology 10, no. 4 (2000), 410; http://www.brown.uk.com/selfinjury/jeffreys.pdf.
13. This information on Musafar is from his website, www.fakir.org.
14. Victoria Pitts, In the Flesh: The Cultural Politics of Body Modification (New York: Palgrave Macmillan, 2003), 2.
15. Lianne McTavish, “What I Learned by Becoming a Body-Builder at Age 45,” The New Republic, March 30, 2015; http://www.newrepublic.com/article/121408/professor-studied-her-own-stint-bodybuilder. See also McTavish, Feminist Figure Girl: Look Hot While You Fight the Patriarchy (Albany, NY: SUNY Press, 2015).
16. Daniel Wojcik, Punk and Neo-Tribal Body Art (Jackson, MS: University Press of Mississippi, 1995); Keith Alexander, “About Piercing,” Body Modification Ezine, 1999, quoted in Jeffreys.

The New Immortalists
David Bosworth

“In two hundred years, doctors will rule the world. Science reigns already. It reigns in the shade, maybe—but it reigns. And all science must culminate at last in the science of healing. Mankind wants to live—to live.”
—Comrade Ossipon, in Joseph Conrad’s The Secret Agent

David Bosworth is an associate professor of English at the University of Washington and a widely published essayist. In addition to two prize-winning works of fiction, he is the author of The Demise of Virtue in Virtual America: The Moral Origins of the Great Recession, published last summer; a companion volume will be published in 2016. Right: Photograph by Maarten Wouters; Photonica/Getty Images.

Tattoos drilled into every curving surface from neck to feet to advertise our latest beliefs and heartfelt allegiances; rings and studs protruding from every possible appendage; Botox shots whose neurotoxin, paralyzing facial muscles, temporarily removes the history of our moods by erasing laugh and frown lines; scalpel-sculpting surgeries that suck away ungainly fat from sides and thighs or nip-and-tuck to iron out the shriveling caused by too much sun and gravity; various and sometimes severe diets based on intricate theories of human development (paleo, Viking, very low calorie); fanatical physical training pursued either to hone a seductive appearance (the actor’s six-pack abs) or to win the laurels of extreme achievement (the Ironman Triathlon); pharmaceutical fixes broadly advertised and promiscuously prescribed for all manner of ailments, a new drug pitched, it sometimes seems, for every age-old pain and psychic misery: To borrow the title from a recent group of so-called reality TV shows, ours has been an age of the “extreme makeover.” And increasingly here in the land of opportunity, this radical remaking of the American self is being pursued through the perfection of the flesh—through beautifying, fortifying, and (just now commencing) digitizing the human body.

The remaking of the self is, of course, an old theme, and one central to the most traditional conceptions of the national character and the American dream. Both Protestant theology’s special emphasis on the born-again experience (our Puritan legacy) and the economic ambitions of the immigrants who landed here cultivated the expectation of a dramatic transformation of the individual’s status. For a long time, the dualistic nature of that expectation both energized and disciplined the American experiment in a liberated individualism; in particular, religious revivals periodically counterbalanced an avid pursuit of the main chance.

Such a capacity for internal self-correction now seems to have waned, though. Attendance at religious services here does remain high, at least when compared to observance in Western Europe, but our fastest-growing congregations of late are ones that have replaced the old emphasis on the myth of the Fall with various versions of a “prosperity theology.” Rather than counteracting it, that newer religious message confirms and abets the rampant materialism of a society whose every other domain is now being marketized for monetary gain. Our coins are still stamped with “In God We Trust,” but the money, not the motto, better defines the now dominant arc of American ambition. And as the “good news” preached in mainstream pews begins to model itself after the get-rich schemes of self-help gurus, the spirit of an age that idolizes its billionaires while obsessing over the perfection of the flesh gravitates toward a kind of evangelical Mammonism.

Still, even as they are co-opted and corrupted, those older beliefs do have an ongoing impact on their replacements. The underlying patterns of Christian theology and eschatology tacitly pre-shape the expectations of their most adamant opponents. Belief in a salvational God has, for these secular evangelists, been recast as faith in the redemptive power of technological progress, and lately, among the digerati especially, the old anticipation of the Second Coming is being replaced by a parallel belief in the imminent arrival of the Singularity: that pivotal point when, according to futurologists, we shall merge with our own super-smart machines and, in a kind of second but self-initiated Genesis, become new beings entirely, “born again” into a far better, if currently inconceivable, state. (“For now we see through a glass darkly; but then face to face,” 1 Corinthians 13:12, King James Version.)

Under this scheme, the incorporeal soul, once thought to be the essence of selfhood, naturally gives way to a purely physical conception of our core identity. Our bodies are our selves, and, it is presumed, their animated clay can be re-scripted and re-shaped according to our individual wills—wills empowered by our ever-improving technological tools. Those tools are impressive, but, as the list that opens this essay suggests, the human desires they now bend to serve are as old as we can trace. Whether posting selfies, bloviating on blogs, or imbibing the products of today’s pharmacology, human beings still want to call attention to themselves; they still desire to be socially admired and sexually desired, to eliminate pain and escape disease. And, at once blessed and cursed with the unique capacity to imagine the future (however darkly), they still greatly fear death and seek any way possible to delay or deny it.


Conrad’s Comrade Ossipon spoke prophetically. Now, as then, “mankind wants to live—to live,” and although only one hundred years have passed, half of his predicted schedule, some of our physician-technicians are already focusing their “science of healing” on a final cure for the human condition. Forget the Christian’s Heaven, the Muslim’s Gardens of Paradise, the Buddhist’s escape from incarnation into Nirvana—and forget the spiritual and ethical labors required to attain those eternal states in otherworldly places. According to these latter-day doctors, death is not an inescapable fate but a technical problem, and one we soon will solve in the here-and-now by immortalizing the body itself.

You Only Live Twice

“The Cryonics Institute is an ambulance ride to the high-tech hospital that we’re confident will exist in the future. When the time comes and present medical science has given up on you or your loved ones, we ask for a second opinion from the future. The choice is yours—Do you take the chance at life or accept mortal fate?”
—“Why Choose Cryonics?”1

Recovering from battle wounds suffered during World War II, Robert Ettinger had mortality on his mind when he ran across research in the field of cryogenics (the formal study of materials subjected to extremely cold temperatures). Those readings led him to speculate that fatally ill patients might be frozen alive and then preserved until medical science had advanced sufficiently to revive them safely and provide new cures. His first expression of that far-out notion was in a work of speculative fiction, “The Penultimate Trump,” which appeared in 1948 in the magazine Startling Stories. Sixteen years later, with the implicit endorsement of Isaac Asimov, who was asked by Doubleday to vet the science cited, Ettinger published the nonfiction bestseller The Prospect of Immortality, to much acclaim and controversy.

Then, in 1976, with the intention of turning his utopian theory into a real option, Ettinger’s Immortalist Society established the Cryonics Institute. For a one-time fee, this nonprofit institution now “cryopreserves” its clients’ bodies, promising to store them until that day of reckoning when the ever-improving science of healing is ready to revive them and cure their diseases. The current preferred method, first practiced by the Institute in 2004, is called vitrification, and involves replacing more than 60 percent of the water in the body with chemicals that prevent freezing; saturated in such a way, clients’ flesh can then be stored at temperatures as low as minus 320 degrees Fahrenheit without the cellular damage that ice crystals normally cause.

Citing published abstracts and posting letters of support from researchers, the websites of the Institute and its primary (and better-funded) rival, Alcor, emphasize the scientific nature of their enterprise. But the fact remains that no one has been revived after undergoing vitrification, and the whole process floats on grand hopes even as it challenges traditional definitions of life and death. Is the cryopreserved body a patient, or is it a corpse? And are these new immortalists true physicians or merely high-tech morticians, producing postmodern versions of Egyptian mummies for a second life that will never come?

From a legal perspective, a client must be formally pronounced dead before cryopreservation can begin. In this sense, cryogenic intervention mirrors the dramatic sequence associated with organ transplantation: constant communication with the family of the mortally ill patient; an emergency team on standby to cool the body as soon as death has been declared; a private flight back to the home base (Michigan for the Institute, Arizona for Alcor), where vitrification commences and the body is stored to await its high-tech cure. As a surgically removed kidney is still viable for a time, so, too, these new immortalists insist, is the whole body. Death is not an on-or-off event but itself a process, and only occurs when irreversible damage has been done to our cells’ internal structures: a chaotic state that normally takes four or five minutes, but that can be slowed by immediate cooling after the heart has stopped and then suspended indefinitely via vitrification. So long as their team arrives in time to “beat the Reaper” in this redefined way, their clients, they claim, are true patients—unconscious, not dead, and as ready to be revived when the time arrives as any heart attack victim on an ER table now would be.

Many illnesses, however, can cause significant internal damage prior to death, diminishing the odds of a successful revival at some later date. This is especially true for diseases of the brain—dementia, multiple ministrokes, voracious cancers—the organ most closely associated with the very identity that the clients of cryonics are so desperate to preserve. One temptation for true believers with those diseases, then, is to hasten their entrance into cryopreservation in ways that society would define as assisted suicide, an action still illegal in most states.

One such case did come to light in 1990. A computer scientist with terminal brain cancer petitioned the California courts to force reluctant surgeons to fulfill his final wish—which was to be decapitated before the tumor destroyed most of his brain cells. He had an economic motive for the drastic means he had chosen. Although committed to cryonics, he didn’t have the funds to purchase a whole-body procedure, and Alcor was offering a “neuropreservation” special. It would store the brain alone, for about one-third the cost of a whole-body treatment: $35,000 versus $100,000. From the patient’s perspective, he was facing either decapitation then, when his brain was still largely intact, or later, when his cancer would have destroyed much of his ability to think at all. With the court’s permission, he was willing to sacrifice his remaining time and most of his body in the faith that “he”—or at least the core component of the self that he believed consisted of his brain alone—could be revived at a time when future oncologists could cure his cancer.2

The courts did deny the petition of this bargain-seeking immortalist, and, given its expense and the absence of anything like a scientific consensus on the viability of its methods, cryonics more generally remains a movement on the fringes. In pursuing their utopian mission, these organizations have also been dogged by further litigation and controversy. They have been sued by grieving family members who desire the finality of a traditional burial or cremation service, and Alcor has been accused of harvesting the DNA of its most famous patient, baseball great Ted Williams, for potential later sale.3

Futuristic Adam by Mike Agliolo/Science Source/Getty Images.

But even if true, that tawdry accusation doesn’t really capture the spirit of cryonics. If there is a corruption at its core, the source isn’t the usual profiteering found in our money-mad society but rather the inflation of hope—another key feature of our national character—into the boundless sphere of hubristic fantasy. Conceived by a mid-twentieth-century American, the movement supplies an extreme and thus illustrative example of postwar boosterism, its cultivation of an ever-expectant attitude, the irrepressible belief that Yankee can-do would do—whatever we wished, and soon. Fed by ceaseless marketing, that attitude was pitched in the exhibits of world fairs and in that mecca of American materialism, Disney’s Magic Kingdom; it was expressed in corporate slogans like “Progress is our most important product” and by that therapeutic mantra of perpetual self-improvement, “Every day, in every way, I’m getting better and better.”

However meretricious those anthems to optimism may seem, they do have a deeper source within the history of the West’s grand ideas. Since its origins in the seventeenth century, the logic of modernity has been promising us, to repurpose a phrase by W.B. Yeats, the “profane perfection of mankind.” But as William Irwin Thompson has observed, despite having donned the rhetorical robes of scientific probity, this “doctrine of progress cannot tolerate, or even perceive, disconfirmation.”4 Like millenarians throughout history, our latter-day prophets of “profane perfection” tend to be blind to their own predictive failures. Rather than recant, they keep resetting the date, place, and specs for that extreme make-over—call it the Workers’ Paradise, the “end of history,”5 or the Singularity—when the human predicament will be rationally solved once and for all.

Unlike the schemes we graph in our minds, our bodies don’t continue to get “better and better”; eventually, they do wither, sag, and weaken—they begin to die. And it is just then, its teams on standby, that cryonics steps in to save the day. With the reality of its patients’ deaths denied, the most devastating “disconfirmation” of the doctrine of progress is once again deferred. Along with all those skull-less brains and headless torsos, the incurable hopes of a utopian philosophy are cryopreserved.

Fantastic Voyage

“We have the means right now to live long enough to live forever.”
—Ray Kurzweil, The Singularity Is Near6

Still, it would be both inaccurate and ungrateful to deny that the science of healing has made impressive progress. Since Conrad’s novel was published in 1907, life expectancy has increased dramatically; and, insomuch as single-celled life forms reproduce through division, creating duplicates of themselves ad infinitum, it also has to be admitted that physical immortality of a certain sort is not utterly alien to the natural world. Cryonics’s earliest advocates lacked convincing credentials, but a newer cast has recently emerged—from prestigious labs and that epicenter of perpetual invention, Silicon Valley—to make a science-based case for a self-generated immortality in the near future.

Armed with new knowledge in many fields, these advocates insist that aging is not a metaphysical fate but a biological process, and, as such, one that can be arrested and eventually reversed. Aubrey de Grey, a Cambridge University−trained biogerontologist who focuses on the mechanisms of aging at the cellular level, claims that we already possess the basic knowledge to pursue those goals and only lack sufficient funding to make the dream come true. By 2100, he believes, “life expectancy will be in the range of 5,000 years.”7 The medical means for that radical life extension will include genomic and cellular repair using as yet undeveloped nanotechnologies, and the regeneration of living tissue through the cultivation of personalized stem-cell lines. Other, computer-based inventions are now also being applied to enhance the chances for life extension, one example of which will have to suffice here. Progress has recently been made in the field of 3-D printing by using “bio-ink” to grow replacement organs.8 If successful, this technique will eventually eliminate the need for human donors, and the body, like a classic car, might be sustained indefinitely through a ceaseless replacement of its various parts.

Today’s most adamant apostle of self-generated immortality, however, is neither a physician nor a biological scientist, but a high-tech inventor and entrepreneur—the professions most admired in a society driven by the profit motive. A prodigy whose software skills have been much enhanced by his endless zeal, Ray Kurzweil has accomplished seemingly impossible feats before. Called the rightful heir to Thomas Edison, he was instrumental in the development of digital scanning, revolutionized modern music by designing one of the earliest and best synthesizers, and wrote the first software program that could translate text into speech—a true boon to the blind that has already placed him high in the pantheon of can-do angels. In a future eulogy that, according to Kurzweil, no one will ever need to give, it would be fair to say that he was someone who, in his brief stay on our planet, made a real difference.

The methods for achieving a triumph over death are defined in detail in three of Kurzweil’s books, beginning with one he coauthored with Dr. Terry Grossman in 2004: Fantastic Voyage: Live Long Enough to Live Forever. In it, the authors complain that “nature, for all its creativity, is dramatically suboptimal,”9 and insist that biological systems will eventually be supplanted by better biotechnologies. This will occur because of an exponential growth in our rate of technological progress—what Kurzweil calls the Law of Accelerating Returns, according to whose calculations the twenty-first century will produce the current equivalent of 20,000 years of high-tech advances. The challenge for us, then, is “liv[ing] long enough” to be the beneficiaries of the immortalizing inventions sure to come—and the way to do so is to follow the prescriptions of Ray and Terry’s Longevity Program, as it is detailed throughout Fantastic Voyage.

That program consists of three stages, or “bridges,” each timed to the predicted rate of medical progress. In the first and current bridge, we are instructed to exploit all the latest diagnostic tools and nutritional supplements even while heeding a series of psychological bromides and age-old maxims, the sum of which the authors conveniently supply in a long bulleted list of imperative advice, including “Take up a new hobby,” “Use a starch blocker,” “Be optimistic,” “Schedule a fasting homocysteine determination,” and “Be like the wise bamboo, and bend.”10 Kurzweil himself swallows 250 self-formulated supplements a day (that’s 91,000 capsules each year) and, once a month, visits a medical center to receive intravenous treatments.11 His aim is to survive until the second bridge, when, he predicts, the science of healing will have advanced sufficiently to “turn off” the aging process, leading finally to the third bridge, when, utilizing nanotechnologies, we will be able to rebuild our bodies and brains at the molecular level, re-creating our selves in ever more intelligent and durable forms. The later stages of this giddy evolution are further defined in Kurzweil’s next two books, The Singularity Is Near: When Humans Transcend Biology (2005) and How to Create a Mind: The Secret of Human Thought Revealed (2012).
In them, he predicts that during the 2020s computers will pass the Turing test and not only become intelligent but also “conscious.”12 In the following decade, we will replace most of our biological organs with better human-made ones, and, by 2045, the Singularity will have occurred—that is, an expansion of human intelligence by “a factor of trillions”13 through its merger with our super-smart computers—after which “there will be no difference between human and machine or between physical and virtual reality.”14 Through new technologies, we will then “vastly exceed the refinement and suppleness of the best of human traits”;15 “human civilization” will become “nonbiological for all practical purposes”;16 and, along with reversing aging, this new civilization will solve nearly all of our current social problems, including poverty and environmental devastation17—utopia as re-conceived by a computer engineer and enacted through his super-smart machines.

And that’s not all. In Kurzweil’s vision, the doctrine of progress is not limited merely to the perfection of our planet and the eternal preservation of those now living. He is also planning to use nanotechnology to resurrect the dead, including his own beloved father and great historical figures such as Thomas Jefferson, so that we, the newly immortalized, might forever converse with the best and dearest minds of the departed. (Once again, a Christian belief, that we can be spiritually reunited with loved ones lost, has been revived in materialist form.) Further, because “intelligence is more powerful than cosmology,” this new non-biological civilization of ours “saturates the matter and energy in its vicinity,” and then, “will overcome gravity” and expanding at “at least the speed of light,” it will colonize the entirety of space-time and thereby “engineer the universe it wants.”18 Omniscience, omnipresence, omnipotence, immortality—insomuch as we (if we is still the proper pronoun) will achieve a status “as close to God as [Kurzweil] can imagine,”19 all the other happy endings previously predicted by the doctrine of progress seem picayune in comparison.

Genial Misanthropy

“Give me the folly that dimples the cheek, I say, rather than the wisdom that curdles the blood.”
—Herman Melville, The Confidence-Man

Kurzweil is, first and last, a software programmer, and so each problem to be solved—in this case, death—is reconceived by him in the manner of the machinery that he knows best. In his re-mapping of the human predicament, our minds are “software,” our bodies “hardware,” and immortality, therefore, a technical problem in “data retrieval.” All we need to do is find a way to “backup our mind files”—just as we now do our e-mails, photos, and memos—and we will surely survive the inevitable demise of our original “hard drives,” our digitized selves living forever in new physical “substrates” of our own invention.20 “Death, be not proud, though some have called thee/Mighty and dreadful,”21 for the superstitious station of a vaporous heaven shall soon be replaced by the certified science of the digital cloud.

Yet as the poet Robert Frost warned, “unless you are at home in the metaphor… you are not safe anywhere.… You are not safe with science; you are not safe in history.”22 Although our comprehension of the world is deeply dependent on metaphorical reasoning, each likeness we fashion is always imperfect; each is shadowed by a set of unlikenesses that, if left unacknowledged, can lead our thinking dangerously astray. And, as Marshall McLuhan observed, this is especially true of pervasive technologies, which, through their habitual use, can recast both our thoughts and actions after their own imagery. Instead of the masters of our new machines, we tend to become, in McLuhan’s view, their “servomechanisms”;23 we see the world through their looking glass, darkly—an observation confirmed daily by millions of smartphone users who, as if expecting a summons from the president, submissively stop whatever they are doing to open each incoming text.

So it is that in a society dominated by finance and marketing, people “shop” for a church, worry about their personal “brand,” and measure the final meaning of any event by its “bottom line.” Likewise, in an age when computers have become, for many, the primary means for personal as well as commercial communications, the metaphors that calibrate Kurzweil’s mind (hard drives, software, data retrieval) have also realigned the collective common sense. But just as in Newton’s era the universe was not really a clockwork, and in Freud’s the psyche was not a steam engine, people today are neither corporate brands nor personal computers, and as that old master of metaphorical thinking warned us, it isn’t safe—ethically, psychologically, or scientifically—to believe otherwise.

Kurzweil and his fellow cyber-utopians, such as Hans Moravec of Carnegie Mellon University,24 need to deny the ultimate unlikenesses between the machines they design and the nature they aim to emulate—they need to believe their machines can become conscious—because those unlikenesses evoke the unknown and so, too, the uncontrollable. For such thinkers, the uncontrollable (as epitomized by death) is always “suboptimal,” which is why biology for them must not only be improved on but finally transcended, evolution itself fully replaced by human invention. So it is that a project that begins with coddling the flesh, as evidenced in Kurzweil’s own fanatical devotion to supplements during the first stage of his Fantastic Voyage, is completed, ironically, by its obsolescence—that is, by replacing our bodies with a new-and-improved series of material “substrates,” our digital selves free to haunt the cyber-engineer’s new and ever better series of robots.

It does make sense, then, that, despite his adoration of science, Kurzweil objects to being called a materialist, preferring the term “patternist” instead.25 Reduced to a servomechanism of the profession he practices, his thinking can conceive of the self only as a software program’s digital pattern. For him, obsessing about the health of his body is just a strategic phase on the way to escaping it entirely, to replacing its physical reality with his virtual reality, where, he imagines, he can “engineer what [he] wants”—which is nothing less than that ultimate boost in individual status, the remaking of himself into a god. And, in the end, this gnostic-like flight from the limits of the flesh becomes a total escape from nature as well. Expanding “at least at the speed of light,” our post-Singularity intelligence will, Kurzweil insists, saturate the whole universe, in which case there will be no more outer space or mysterious wilderness, no otherness at all to ponder or probe, or to challenge by contrast who we are and what we ought to do. In Kurzweil’s dream, the cyber-amplified “mind” not only becomes (as Milton’s hell-bound Satan desperately boasts) “its own place”;26 it becomes every place. As is commonly the case for captains obsessed with control, the final port of call for his Fantastic Voyage is a state of solipsism. Even if his utopian science were feasible—and the unlikenesses in the metaphor he rides (yea, even unto infinity) discount that possibility—we would have to ask if such a condition is finally desirable, or if instead the paradise he pitches is just the punitive loneliness of the mythic Narcissus inflated to fit an astronomical scale.

In The Confidence-Man, Herman Melville coined the perfect term for American modernity’s descent into folly while blindly following the doctrine of progress: “genial misanthropy.” Melville saw that in a society increasingly dominated by chipper salesmen and can-do engineers, the constant boosting of the next sure bet, free lunch, or final cure for pain and death cloaked a fear and loathing of the human condition as it actually is. Material confidence was being mustered to mask a metaphysical cowardice, providing a way to dodge those final questions of meaning and purpose that our mortality imposes, and inducing in their place an undue hope, which could then be exploited to close the sale on a whole series of the con man’s dubious wares.

Normally, I am not given to prophetic utterances, preferring to cite instead, as an antidote to arrogance, the Japanese aphorism “One inch ahead/the whole world/is dark.”27 In an era of hype, though, sometimes the obvious needs to be said. And so I will end with the following set of unexceptional predictions: Everyone reading this essay will die, as will Ray Kurzweil, as will I, as will eventually our entire species, men and women, enemies and friends, predators and prodigies alike. It is almost certainly the case that the average duration of our stay here will be further extended by the science of healing, for which we should be grateful. But “time and chance” will still “happen to us all.”28 And if a beloved father is to be resurrected, whether Ray’s or mine, it won’t be by our hands, for death and risk aren’t just technical bugs in our biological system but fundamental features of reality’s existence. The final escape so confidently pitched by these new immortalists is at its core a fear-driven form of psychological denial and, as such, a betrayal of the gift of consciousness itself. The Fantastic Voyage they promise us, whether in their deep-freeze vats or nutritional labs, is just one more chapter in our passage on Melville’s “ship of fools.”29

Endnotes
1. “Why Choose Cryonics?” Cryonics Institute; http://www.cryonics.org/.
2. “Man Sues to Allow Freezing of Head before He Dies,” United Press International, May 2, 1990. See also Louis Sahagun and T.W. McGarry, “Investigators Believe Woman Was Dead before Decapitation,” Los Angeles Times, January 15, 1988.
3. “Report Says Williams’ DNA Missing,” Associated Press, August 13, 2003.
4. William Irwin Thompson, The American Replacement of Nature (New York: Doubleday, 1991), 124.
5. After the fall of the Berlin Wall, the American political scientist Francis Fukuyama argued that with the triumph of the free-market democracies over communism, the perennial problem of political governance had been solved, and we had reached the “end of history.” See Francis Fukuyama, The End of History and the Last Man (New York: Free Press, 1992).
6. Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Viking, 2005), 371.
7. Quoted in Ray Kurzweil and Terry Grossman, Fantastic Voyage: Live Long Enough to Live Forever (Emmaus, PA: Rodale, 2004), 14.
8. Dan Ferber, “An Essential Step Toward Printing Living Tissues,” February 19, 2014; http://www.seas.harvard.edu/news/2014/02/essential-step-toward-printing-living-tissues.
9. Kurzweil and Grossman, Fantastic Voyage, 14.
10. See online companion to Fantastic Voyage, “A Short Guide to a Long Life”; http://www.fantastic-voyage.net/ShortGuide.htm. Accessed March 11, 2015.
11. Ibid., 139–145.
12. Ray Kurzweil, How to Create a Mind (New York: Viking, 2012), 209–10.
13. Kurzweil and Grossman, Fantastic Voyage, 123.
14. Ibid., 9.
15. Kurzweil, The Singularity Is Near, 9.
16. Ibid., 352.
17. Ibid., 259.
18. Ibid., 364.
19. Ibid., 375.
20. David Kushner, “When Man & Machine Merge,” Rolling Stone, February 19, 2009, 57–61.
21. John Donne, Holy Sonnets, Number 6/10 (Oxford: Oxford University Press, 1990).
22. Robert Frost, “Education by Poetry,” Amherst Graduates’ Quarterly, February 1931: “What I am pointing out is that unless you are at home in the metaphor, unless you have had your proper poetical education in the metaphor, you are not safe anywhere. Because you are not at ease with figurative values: you don’t know the metaphor in its strength and its weakness. You don’t know how far you may expect to ride it and when it may break down with you. You are not safe with science; you are not safe in history”; http://www.en.utexas.edu/amlit/amlitprivate/scans/edbypo.html.
23. Marshall McLuhan, Understanding Media: The Extensions of Man (New York: New American Library, 1964), 51–56.
24. See, for example, Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, MA: Harvard University Press, 1988).
25. Kurzweil, The Singularity Is Near, 4.
26. John Milton, Paradise Lost, Book I, lines 250–255 (New York: Penguin, 1998). …hail horrors, hail / Infernal world, and thou profoundest Hell / Receive thy new possessor: one who brings / A mind not to be chang’d by place or time. / The mind is its own place, and in itself / Can make a Heav’n of Hell, a Hell of Heav’n.
27. W.S. Merwin, Asian Figures (New York: Atheneum, 1973), 44.
28. “I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favor to men of skill; but time and chance happeneth to them all,” Ecclesiastes 9:11, King James Version.
29. Herman Melville, The Confidence-Man: His Masquerade (New York: Grove Press, 1954), 25. Originally published 1857. An early skeptic in The Confidence-Man berates the passengers on Melville’s steamship for their willingness to believe in the con man’s initial scheme: “You flock of fools, under this captain of fools, on this ship of fools!”

Body and Soul
Mark Edmundson

Does the body still exist if we do not have souls? It may sound like a flippant question—or at least a sophistic one. I intend it as neither: Does the body still exist if we do not have souls?

That we do not have souls is a palpable fact to many—I would even say most—members of the educated classes in the West today. They don’t go to church. They don’t spend a lot of time thinking about a possible life to come. They don’t bother themselves terribly about the matter of God. But no one denies that we possess bodies. We are all flesh and blood and bones. This much is common knowledge and beyond any real dispute. We eat and drink and sleep and copulate—and now, of course, we exercise. We have a pulse and a blood pressure. We live and then we die.

But what happens to this body when there is no soul? What happens when we conceive of existence in a way that is no longer dialectical? Over centuries and centuries—beginning long before Christianity acquired its own stand on the question—we believed that we possessed a dual identity. We have been two entities: body and soul. Often—one may say almost always—we considered these two forms of being to be in tension. They were, first, of a different nature. The body perished. The soul was eternal. The body died and decomposed, while the soul returned home to God, or flew to the other place. The body was mortal, the soul eternal.

And the body and soul, needless to say, were often in conflict. The body demanded carnal satisfactions, which were, at least in many traditions, antithetical to the health of the soul. Gluttony and lust and sloth: Three of the seven deadly sins involve excesses of the body. The individual striving to achieve salvation, or simply to live a righteous life, needed to defend against the tendency of his body to betray his hopes.

Mark Edmundson is University Professor at the University of Virginia. His new book, Self and Soul: A Defense of Ideals, will be published this autumn by Harvard University Press. Right: Liz as Cleopatra, 1962, by Andy Warhol (1928–1987); private collection/Bridgeman Images. © 2015 The Andy Warhol Foundation for the Visual Arts, Inc./Artists Rights Society (ARS), New York.


Saint Augustine tells us that in the Garden of Eden, Adam was never overcome by lust. Yet he was still in a position to be the father of the human race. Copulation without lust? Sex without forbidden desire? Yes. In the Garden, according to Augustine, Adam willed his erections: They were dependent on the free activity of his rational mind. In Paradise, the body never got in the way of the soul. Until, of course, it did. Aided by Milton, we commonly conceive of the fall of man as an act of bodily indulgence: that gorgeous, lush apple. Pride was at the center of the fall. Milton suggests that Eve and Adam desired to be like gods. But the fact that the expression of that pride was a form of bodily indulgence—surely that matters, too.

Body and soul have not been on easy terms down through time. In his poem “Among School Children,” William Butler Yeats imagines a state of being in which “body is not bruised to pleasure soul.” He wants to enjoy the world without outraging the aspirations of his spirit. But the line suggests that Yeats is aware of the tension, knowing it to be endemic to Christianity and much of Western thought. He is, I think, imagining a pagan dispensation (which may or may not ever have existed) in which the demands of the inner life are perfectly compatible with the pleasures of the body. Yeats knows how remote this is from the realm of his inherited Christian culture and (I suspect) from his personal possibilities. Yeats being Yeats, this makes the state in which the body isn’t sacrificed to the longings of the soul—immortal longings, one might imagine—all the more precious.

The phrase “immortal longings” is Cleopatra’s from the last act of Shakespeare’s Antony and Cleopatra, when the queen is about to commit suicide rather than fall under the power of the baleful Emperor Augustus. Cleopatra clearly refers to the immortality that comes after a life of doing amazing deeds, the kind of legendary immortality that Achilles sought, and that Shakespeare, among others, confers upon her. Shakespeare’s Cleopatra is not going to a heaven of any sort. She faces her death feeling that the life of her spirit and the life of her body have not, overall, been incompatible. Her immortality will come, in some measure, from the gusto with which she indulged her bodily hungers. Before Antony became her lover there was Pompey, before Pompey there was Julius Caesar (“He plough’d her, and she cropp’d”), and how many more besides? Say what you like about Cleopatra as she is rendered by Shakespeare (and history, give or take): Her body was not bruised to pleasure her soul. Her body and her spirit streamed in the same direction: to pleasure and to power, and then on to renown, a renown she still possesses. “Age cannot wither her, nor custom stale her infinite variety.” But to the heaven of the angels, of the thrones, dominations, princedoms, virtues, and powers, Cleopatra, one dares to say, will secure no admittance.

It is only through the harsh contentions of body and soul that the Christian spirit in the post-pagan world can achieve salvation in the next world and virtue in this one. Virtue in this one? Yes. Ralph Waldo Emerson speaks of the harsh battle of fate in which souls are born—and in which they grow to full potency: For virtue comes from the Latin word that means strength. Cleopatra’s renown comes from her capacity to live the life of the body to the full and to provide a beacon for aspirants to extravagant pleasure and sexual delight. The individual who lives out the contention of body and spirit can overcome the bestial part in himself—what Blake and Coleridge thought of as the Natural Man—and achieve something else. Sometimes he achieves self-conquest with an eye to heaven. But sometimes he achieves it for the purposes of earthly life. The thinker and the warrior and the saint struggle against the domination of the body and its baser appetites so that they can realize their full potential on this earth. The thinker seeks truth; the warrior seeks victory; the saint seeks a compassionate world, a world of brothers and sisters. These aspirations may not be consistent with one another: Warriors and saints have their conflicts (although they have some secret affinities, too). But the truth is that these idealists see the hungers of the body as impediments to their highest aspirations. Pleasure, enjoyment, even happiness: These are states tied to bodily satisfaction, although, abided in too long, they will lead to inertia. To the idealist, who seeks perfection, the body is both antagonist and ally. Like that of the believing Christian, his life is based upon tension. He lives a fraught dialectic in which the kind of resolution that Yeats entertains and that Cleopatra (or at least Shakespeare’s Cleopatra) achieves is radically undesirable.

Does the body still exist if we do not have souls? The question now may be a bit more cogent, although still strange enough. What happens, in other words, if the dialectic that has existed for believers and for idealists alike suddenly collapses? If there are no souls, are there still bodies in the conventional sense—the sense that puts the body in tension with the soul? If there are no ideals (and our culture is surely not in love with ideals), what happens to the bodies (or what once were the bodies) of those who might have been idealists?

The answer that follows from this line of thinking, provisional though it is, is that our bodies become ourselves. Men and women are no longer and can no longer be what Emerson called us, “a stupendous antagonism.” We are not a dragging together of the poles of the universe for the purposes of salutary struggle. After the collapse of the soul and the collapse of the ideal, after the death of the internal dialectic, what exists? I do not think it is wrong to say that what we are left with, when our bodies become ourselves, is the quest for pleasure. If the body is the only existence (and therefore not quite the body as we knew it before), then we need to gear ourselves to living as enjoyably as possible. The objective of life becomes the avoidance of pain and the stringing together of as many moments of gratification as possible. What else could it be?

Surely there will be impediments to this life defined by what we might call the Body Omnipotent (which is no longer the body of old). One will be too sick for enjoyment; one will lack funds; one will not know which pleasure to choose among the many on offer; one’s communications device will malfunction and mislead. No doubt there will be internal impediments to the life of pleasure: Maybe there is a superego after all, but in time and with the help of various drugs and therapies, we shall overcome it. And, curses upon it, one must have a job. One must labor. But the aim of life becomes clear: It is the utilitarian’s aim of maximizing pleasure and minimizing pain. The Body Omnipotent cannot conceive of anything else. And why should it?

The Body Omnipotent will happily accept prosthetic extensions of itself to augment power and pleasure. With the addition of the machine, one can enjoy more—which is to say buy more and experience more. Will there still be a mind? Of course there will: a mind enhanced and enlarged by one electronic device after another. But the mind will not be disposed to think its way clear of the Body Omnipotent. The mind will not put the body’s hegemony in doubt. The mind will become a tool functioning strategically to deliver as much pleasure as possible to the self. What is the best dinner? Where the best car? How to possess the most lulling vacation? To string together as many beads of pleasure as possible will be to construct a life: this meal, that trip, this dress, that suit. We came, we saw, we enjoyed. Soul departs the world; body disappears to be resurrected as the god of itself. Homer, Plato, and the Gospels recede. Cleopatra is eternal queen.


On Not Being There
The Data-Driven Body at Work and at Play
Rebecca Lemov

Rebecca Lemov is associate professor of the history of science at Harvard University and the author of World as Laboratory: Experiments with Mice, Mazes, and Men (2005) and Database of Dreams (forthcoming 2015). She is also a coauthor of How Reason Almost Lost Its Mind: The Strange Career of Rationality in the Cold War (2013). Right: Photograph composition by Henry Sieplinga/HMS Images; Stockbyte/Getty Images.

The protagonist of William Gibson’s 2014 science-fiction novel The Peripheral, Flynne Fisher, works remotely in a way that lends a new and fuller sense to that phrase. The novel features a double future: One set of characters inhabits the near future, ten to fifteen years from the present, while another lives seventy years on, after a breakdown of the climate and multiple other systems that has apocalyptically altered human and technological conditions around the world. In that “further future,”1 only 20 percent of the Earth’s human population has survived. Each of these fortunate few is well off and able to live a life transformed by healing nanobots, somaticized e-mail (which delivers messages and calls to the roof of the user’s mouth), quantum computing, and clean energy. For their amusement and profit, certain “hobbyists” in this future have the Borgesian option of cultivating an alternative path in history—it’s called “opening up a stub”—and mining it for information as well as labor.

Flynne, the remote worker, lives on one of those paths. A young woman from the American Southeast, possibly Appalachia or the Ozarks, she favors cutoff jeans and resides in a trailer, eking out a living as a for-hire sub playing video games for wealthy aficionados. Recruited by a mysterious entity that is beta-testing drones that are doing “security” in a murky skyscraper in an unnamed city, she thinks at first that she has been taken on to play a kind of video game in simulated reality. As it turns out, she has been employed to work in the future as an “information flow”—low-wage work, though the pay translates to a very high level of remuneration in the place and time in which she lives.

What is of particular interest is the fate of Flynne’s body. Before she goes to work she must tend to its basic needs (nutrition and elimination), because during her shift it will effectively be “vacant.” Lying on a bed with a special data-transmitting helmet attached to her head, she will be elsewhere, inhabiting an ambulatory robot carapace—a “peripheral”—built out of bio-flesh that can receive her consciousness. Bodies in this data-driven economic backwater of a future world economy are abandoned for long stretches of time—disposable, cheapened, eerily vacant in the temporary absence of “someone at the helm.” Meanwhile, fleets of built bodies, grown from human DNA, await habitation.

Alex Rivera explores similar territory in his Mexican sci-fi film The Sleep Dealer (2008), set in a future world after a wall erected on the US–Mexican border has successfully blocked migrants from entering the United States. Digital networks allow people to connect to strangers all over the world, fostering fantasies of physical and emotional connection. At the same time, low-income would-be migrant workers in Tijuana and elsewhere can opt to do remote work by controlling robots building a skyscraper in a faraway city, locking their bodies into devices that transmit their labor to the site. In tank-like warehouses, lined up in rows of stalls, they “jack in” by connecting data-transmitting cables to nodes implanted in their arms and backs. Their bodies are in Mexico, but their work is in New York or San Francisco, and while they are plugged in and wearing their remote-viewing spectacles, their limbs move like the appendages of ghostly underwater creatures. Their life force drained by the taxing labor, these “sleep dealers” end up as human discards.

Flickering In and Out

What is surprising about these sci-fi conceits, from “transitioning” in The Peripheral to “jacking in” in The Sleep Dealer, is how familiar they seem, or at least how closely they reflect certain aspects of contemporary reality. Almost daily, we encounter people who are there but not there, flickering in and out of what we think of as presence. A growing body of research explores the question of how users interact with their gadgets and media outlets, and how in turn these interactions transform social relationships. The defining feature of this heavily mediated reality is our presence “elsewhere,” a removal of at least part of our conscious awareness from wherever our bodies happen to be.

As MIT psychologist Sherry Turkle has shown in pioneering work that extends from The Second Self (1984) to Alone Together (2012), the social ramifications of these new disembodied (or semi-disembodied) arrangements are radical. They introduce a “new kind of intimacy with machines,” a “special relationship” in the space beyond the screen, and a withering away of once-central, physically mediated social bonds. Turkle’s focus, and the focus of much literature on video-game playing and online behavior, is on these engrossing relationships between humans (particularly children) and computers, the social fallout of those relationships, and the resulting effects on self-formation, as hauntingly described in an early work by Turkle on “computer holding power”:

The [thirteen-year-old] girl is hunched over the console. When the tension momentarily lets up, she looks up and says, “I hate this game.” And when the game is over she wrings her hands, complaining that her fingers hurt. For all of this, she plays every day “to keep up my strength.” She neither claims nor manifests enjoyment in any simple sense. One is inclined to say she is more “possessed” by the game than playing it.2

The young teens Turkle watched playing Asteroids and Space Invaders are now in their mid-forties, and the dynamic of absorption, tension, possession, and disappearance is, of course, no longer confined to games. Much discussion of data-gathering technologies in daily domains focuses on their inescapability, as Tom McCarthy recently pointed out: “Every website that you visit, each keystroke and click-through are archived: even if you’ve hit delete or empty trash it’s still there, lodged within some data fold or enclave, some occluded-yet-retrievable avenue of circuitry.”3

Self-Knowledge Through Numbers

But seemingly undaunted by the extent to which we are now routinely subjected to the data gathering of others, many people are now driven to accumulate endless quantities of data about themselves, their bodies, their activities, their moods, even their thoughts and reveries. The “most connected human on earth,” Chris Dancy, a former information technology specialist who took to gathering data about himself after being laid off from his job, bills himself as a “Data Exhaust Cartographer,” “The Versace of Silicon Valley,” and “Cyborg.”4 He bedecks his body with myriad wearables and promotes himself as the locus of up to 700 devices or online services that collect, crunch, save, and collate the data he generates. The metrics he tracks include pulse, REM sleep, skin temperature, and mood, among others. Perhaps not surprisingly, all of this self-tracking eventually led Dancy to a crisis of alienation. He became increasingly aware that his intense connection was also a form of disconnection: “I was coming slightly unhinged with the amount of information I had about myself. It started to make me feel slightly detached from reality.” As a result, he says, he was “almost waterboarded with awareness. It’s one thing to Google yourself. It’s another to Google…your life. I could see too much.”5 Despite his discomfort, Dancy seems unable to disconnect, unhook, or go offline. He is not alone.

The focus of much recent interest in the entrance of tracking technology, counting devices, and calculation strategies into the domain of self-understanding is the Quantified Self (QS) movement. Founded in 2007 through the efforts of Kevin Kelly, then of Wired magazine, and Gary Wolf, a Bay Area writer, the movement brought together self-trackers ranging from the ardent to the merely curious. Under the banner
of “self-knowledge through numbers”—those numbers gathered through biometrics, sociometrics, and psychometrics—enthusiasts combine platforms and tools to find new ways of gathering data and teasing out correlations. “Once you know the facts, you can live by them” is another guiding principle of the movement, and QS-ers continue to form groups across the United States and in thirty other countries, meeting weekly to share results. During the week of March 15, 2015, for example, groups came together in London, Washington, St. Louis, Denton, Texas, and Thessaloniki, Greece. Typically, such gatherings report on their tracking of a range of phenomena from the mundane (cups of coffee drunk per day, pulse rate, sleep hours) to the more esoteric (“spiritual well-being,” scores on personality tests or a “narcissism index,” or a repository of “all the ideas I’ve had since 1984”) via devices that might be attached to the wrist (Fitbit), the lower back (UpRight), the chest (Spire), or eating utensils (HAPIfork), if not stowed away in one’s pockets (as smartphone apps).

The movement marked its arrival in the cultural mainstream with the publication in 2010 of Wolf’s manifesto, “The Data-Driven Life,” in the New York Times Magazine. His fascination with the obsessively self-regarding project came through most clearly in his example of the tracker who had kept all of his ideas for the past several decades:

Mark Carranza—[who] makes his living with computers—has been keeping a detailed, searchable archive of all the ideas he has had since he was 21. That was in 1984. I realize that this seems impossible. But I have seen his archive, with its million plus entries, and observed him using it.… Most thoughts are tagged with date, time, and location. What for other people is an inchoate flow of mental life is broken up into elements and cross-referenced.6

Wolf went on to describe how numbers inexorably enter the domain of the personal, insisting that no place should be considered sacrosanct or beyond the probing sensors of quantification. Wolf was so surprised, he later told me, by the contempt and mockery he and his fellow self-trackers came in for after the article appeared that he almost came to regret writing it. Much of the online comment focused on the atrophied selves and dehumanizing effects seemingly produced by the self-tracking enterprise: “These unfortunate people spend so much time with computers they have begun thinking about their own person as a machine,” wrote one reader. Another comment was even more barbed: “This tracking seems like taking self-centeredness to the nth degree. It is basically OCD [obsessive-compulsive disorder] behavior with a fancy title. How about ‘tracking outward,’ seeing
how much time we spend on being with and helping others? Perhaps that is the secret to a longer, healthier life.”

Those and other critics clearly saw the self-tracking obsession as navel-gazing, inward-turning, computer-oriented geek behavior, typical of those who have lost contact with the external world, with other human beings, and with “what matters,” in their eagerness to render the world knowable, computable. At stake, it seems, is nothing less than the transformation or deformation of the “human.” Writing in the monthly magazine Prospect, literature scholar and psychoanalyst Josh Cohen raised the pertinent question:

Sifting through the talks, blog posts, and articles daily uploaded by Quantified Self disciples, you soon become aware of an anxious insistence on numbers as a means rather than an end. All this data is meant to spur us to love ourselves better and run our lives more efficiently. And yet it’s hard not to hear, lurking in this promise of self-possession, the threat of numbers dispossessing us, of becoming a feverish addiction we can’t kick. Can even the most adept multi-tasker really live the life they’re simultaneously tracking?7

Other critics see the QS movement as part of information technology’s more widespread induction of people into “a perpetual state of shallow performativity.”8 What is neglected or bracketed in both the criticism and the celebration of self-tracking is the curious status of the body that serves as the passively patient platform for a self’s “remote” activity or as the hooked-up object of endless measurement and observation—or indeed as both. Critics and enthusiasts of this strange reality both neglect the peculiar Möbius-strip form taken by the body as the increasingly phantom-like self flickers in and out of its confines. The status of the body that holds these devices, the body as platform—the body that is vacated—is curiously invisible.

Clickworkers, Gold Farmers, Porn Zappers

Where the body can be seen, I believe, is in the menial, low-wage, data-driven labor that is created at the downtrodden edges of expanding economies where virtual domains meet brick-and-mortar enterprises. One clear picture of the simultaneously abandoned and surveilled body emerges in research on the most menial work: collective labor markets harnessing human computing abilities. “Clickwork” is the mass labor of many hands on many keyboards, their collective output aggregated by means of Internet tools such as Amazon Mechanical Turk. Through AMT, individuals and businesses (known as Requesters) can crowdsource complex tasks that computer intelligence is currently unequipped to complete. Amazon and other companies cannot afford to regulate
this labor through traditional means; instead, administrators filter it through “light” automated management rather than top-down, heavy-handed control. Microwork ethnographer Lilly Irani describes how, for example, management of a work force of 10,000 to 60,000 for a particular project can never affordably be handled by means of Foucauldian “disciplinary” techniques, which carefully mold individual workers physically and mentally for their tasks. Rather, management must operate automatically, with a light touch: Instead of using surveillance to assess performance, “requesters sort desirable workers through faint signals of mouse clicks, text typed, and other digital traces read closely as potential indicators.”9 Most often, workers work alone at home on their own computers. Repetitive work in the virtual sphere, in addition to being isolating, often necessitates less attention to bodily postures and needs, and may promote ongoing abuse of the body by motivating the worker to conform to algorithmically defined productivity goals that affect the body at its performance limits.

Two examples are China-based World of Warcraft “gold farmers” and content moderators in the Philippines who zap porn and disturbing images from social media sites for cash. Many of them based in suburban Manila in former elementary schools and other unlikely sites, the content moderators perform the unsavory job of repeatedly adjudicating whether images posted to Twitter feeds, Facebook pages, or other social networking sites are sufficiently offensive to be eliminated from view. Moderators at PCs sit at long tables for hours, an “army of workers employed to soak up the worst of humanity in order to protect the rest of us.” By some estimates, the content-moderating army is 100,000 strong, twice the size of Google’s labor pool, and many of its members have college degrees. Such workers suffer both physical pain and psychological distress. Jane Stevenson, of the British organization Workplace Wellbeing, which supports traumatized workers in high-pressure digital jobs, says that even after a worker has quit such a job, he or she may continue to be haunted by disturbing images. Looking for hours at YouTube videos of unspeakable abuse, many become paranoid and uneasy about leaving their children with sitters.10

In a profile of other potentially abusive digital-work environments, technology writer Julian Dibbell emphasizes their “surreal” quality.11 One such workplace is that of Chinese “gold farmers,” who participate in multiplayer online role-playing games, known as MMOs. In these games, which can involve thousands of participants, players advance by earning extra powers and levels of play not only through hard hours at the keyboard but also (particularly among Europeans and Americans who can pay real money for virtual gold or game goods) by buying them online. In the early years of MMOs, these transactions took place on eBay, but now
there are “high-volume online specialty sites like the virtual-money superstores IGE, BroGame, and Massive Online Gaming Sales—multimillion-dollar businesses [that] offer one-stop, one-click shopping and instant delivery of in-game cash.” Gold farmers work shifts in “sweatshops,” advancing through MMOs so that richer players can jump effortlessly to higher levels. Such digital toil is not so different from that of Chinese laborers who work long hours to produce cheap real-world products for the global market. Yet the alienation of the body is perhaps more extreme because it is more unaccounted for. Dibbell describes the common condition of such laborers, exemplified by the routine of one particular gold farmer:

Consider, for example, a typical interlude in the workday of the 21-year-old gold farmer Min Qinghai. Min spends most of his time within the confines of a former manufacturing space 200 miles south of Nanjing in the midsize city of Jinhua. He works two floors below the plywood bunks of the workers’ dorm where he sleeps. In two years of 84-hour farming weeks, he has rarely stepped outside for longer than it takes to eat a meal. But he has died more times than he can count. And last September on a warm afternoon, halfway between his lunch and dinner breaks, it was happening again.12

What was happening again was that Min was being “exterminated” online, within the confines of the game. Although the Chinese gold farmers sit relatively motionless in their rows of chairs facing screens in nondescript rooms, they are frequently subjected to targeted “kills” by Western players who, playing purely for “fun,” regard the Chinese players as mercenaries. Each time they “die” (in World of Warcraft or other games), their pace of play slows down and they lose money rebooting their characters.13 Art imitates life in The Peripheral, where Flynne Fisher does online gaming for hire and endures similar abuse: A rich man who played the game himself instead of outsourcing got a charge from killing the avatars of people like Flynne because “it really cost them.… People on her squad were feeding their children with what they earned playing, and maybe that was all they had.”14 Economic inequality, whether in fictional 2030 or actual 2015, plays out in online spaces and even extends to forced labor. Forbes magazine recently reported that Chinese prisons forced inmates to gold-farm in twelve-hour shifts without pay.15 The coercive element highlights arrangements that also exist in the putatively voluntary forms of loot farming.

Extensive digital tracking of workplace activity adds to bodily stresses. An American Management Association survey found that 66 percent of US-based employers monitor the Internet use of their employees, 45 percent track employee keystrokes, and 43 percent monitor employee e-mail. UPS uses a system, Kronos, under which each of its delivery trucks is equipped with 200 sensors, which feed information back to headquarters about driving speed, seatbelt use, and delivery efficiency. Even trying to cheat the system can hurt the worker. Drivers commonly evade the seatbelt sensor by keeping the seatbelt locked but not strapping themselves in. UPS can claim higher safety compliance even though workers are actually more endangered. A driver recently described cutting corners, slapping delivery
slips on doors, and sprinting from site to site to keep up with impossibly demanding quotas. (After eight years, he sustained such extensive spinal damage that his doctor told him it would be impossible to treat.)16 Work-force management systems such as Kronos and “enterprise social” platforms like Microsoft’s Yammer, Salesforce’s Chatter, and (coming soon) Facebook at Work operate on similar principles of efficiency and maximization.17 With the emergence of “flexible,” short-term regimes of service-based labor and the eclipse of social welfare programs, the self can increasingly be seen as an entrepreneurial project and a risk-taking device.18 If the self is risk taking, the body is risk absorbing. New labor forces of clickworkers, gold farmers, porn zappers, Starbucks flextimers, and Amazon warehouse fulfillers bring to light more clearly the consequences of both the abandonment and extreme monitoring of the body. At the same time, because such workers are often desperate for work and less picky about conditions, they are less likely to incorporate practices or technologies that are becoming increasingly common among upper management as methods of counteracting the physical toll of excessive “screen time” and “chair time”: exercise regimes such as yoga or extreme fitness, or office equipment such as “stand-up desks.”19

The Detachable Body and the Mobile Self

Despite the fact that the vacant or overly tracked body is increasingly the condition of people at play, and (especially) at certain kinds of repetitive and exploitative work, the body remains in the background of our awareness. It is perhaps no coincidence that both Gibson and Rivera focus on situations of wrenching economic inequality across globalized domains of capital transfer. At this level and scale of human activity, the strangeness of bodily conditions becomes more obvious through a kind of exaggeration that de-familiarizes what we have come to take for granted. It is not that we are completely unaware of real stories of warehouse workers in companies such as Amazon who are digitally tracked and prodded as they go about the work of fulfilling online orders. Dystopian fiction only amplifies and catalogs the indignities of existing dehumanizing practices. “They’re Watching You at Work,” declares an Atlantic headline, while a public radio report on “the data-driven workplace of the future” describes employees who ruin their bodies keeping up with “telematic” surveillance devices that track every keystroke they make, every latte they whip, every package they deliver.20

These conditions are moving inexorably up the corporate ladder and economic strata, even as they dissolve ordinary hierarchies. The Quantified Self, once seen as a respite from work, is arriving in the workplace in the form of perpetual self- and management-imposed surveillance. (As mentioned above, the quality of this surveillance is “lighter” and more flexible than traditional panoptical oversight.) Among higher-wage workers, the Quantified Self at work takes the form of socially networked goal setting, in which workers prod each other or companies target “millennials” (who are thought to respond more readily to these new forms of what could be called cheerful tracking).
Playing World of Warcraft, Gamescom 2011 in Cologne; © Ina Fassbender/Reuters/Corbis.

Santa Monica-based Enkata, for example, a human resources firm that hires out data-driven platforms to prod claims and sales workers into higher productivity, explicitly eschews keystroke monitoring in favor of “meaningful data” and “predictive analytics to help all members of the sales organization work smarter and close more deals.”21

The body of today’s digitally driven worker evokes those images of bodies stored in suspended animation in various movies from the 1970s onward, including Coma (1978) and Altered States (1980). A number of films explored the horrifying possibility of human bodies being used as food (Soylent Green, 1973), batteries (The Matrix, 1999), or intelligence systems drained in the process of use (Minority Report, 2002). By contrast, James Cameron’s Avatar (2009), itself the product of the work of thousands of animators and digital engineers, strikes a more hopeful, even utopian note. It is a film in which a disabled vet is enabled by technological prostheses to inhabit a mythical world while leaving his wired-up body behind. The shocking vulnerability of his temporarily discarded physical form becomes all too evident in the climactic battle, but the hero ultimately prevails by sundering the connection to the body and living on in the fantastic realm of the Na’vi, who have their own, organic way of “plugging in”—inserting their braids into the neural cords of horse-like animals and operating them through their thoughts. To be more fully human, or post-human, will mean finding a new way of plugging in, jacking in, or transmitting.

Whether fantastic or horrifying, the picture presented in these films is that of a body increasingly detached—or made detachable—from a mobile self. This body, because of the systemic shocks it bears, its use as a platform or a source of energy, and even its inescapable
mortality, exemplifies what political scientist Timothy Pachirat in Every Twelve Seconds, an ethnographic study of work in a Nebraska slaughterhouse, calls the politics of sight. Exploring the conditions of a low-wage job typically sought by criminals, undocumented workers, or other desperate souls, Pachirat finds that the slaughterhouse operates to make the repetitive acts of killing—which take place “every twelve seconds,” hence the title—invisible even to 99 percent of those who work there. Only one out of 280 workers is responsible for firing the fatal shot into the head of the cow. The shooter’s work is visible to only one or two others on the killing floor, and he becomes the subject of mythology throughout the abattoir. (A common rumor is that the shooter undergoes constant psychotherapy to fend off work-produced psychosis.) Workers stationed throughout the rendering process, who spend hours each day repetitively detaching the limbs and extracting the livers and other viscera of the recently executed creatures, suffer difficult work conditions and marginalization that mirror the unseen suffering of the slaughtered animals.

In the end, Pachirat argues, this cultivated “invisibility” (which, in a sense, is the main service offered by the modern slaughterhouse and its disassembly lines) is supremely necessary social and political labor. It allows most people in the “outside world” to act without knowing the consequences, to consume without knowing the cost, and to benefit from others’ work without knowing the source. It is, in fact, on such exquisitely chosen “invisibilities” that the collective delusions and collusions of the modern economy run, particularly as that economy merges with the virtual realm. To extend the analogy, just as there is a public need for packaged meat that does not bear the evidence of its origins or even of the fact that it once lived, there is likewise a public desire for products (be they iPhones or UPS packages) that sleekly obscure the conditions under which they were made or made possible. Pachirat tells of “work that remains hidden from the majority of those who literally feed off such labor.”22 As literary scholar Katherine Hayles recently remarked, the body “has an inability to lie” in the way thoughts can and do: “This is exactly what consciousness lacks.”23 The body offers a kind of resistance and testimony to realities that some would like us simply to ignore. We need to heed the body.

Endnotes

1 Gibson describes it in an interview with Karin L. Kross posted at Tor.com, “William Gibson on Urbanism, Science Fiction, and Why The Peripheral Weirded Him Out,” October 29, 2014; http://www.tor.com/blogs/2014/10/william-gibson-the-peripheral-interview. Spoiler alert: please skip the next three paragraphs if you would rather not know some of the plot details of The Peripheral.
2 Sherry Turkle, “Video Games and Computer Holding Power,” The New Media Reader (Cambridge, MA: MIT Press, 2003), 500. Originally published 1984; http://www.newmediareader.com/book_samples/nmr-34-turkle.pdf.
3 Tom McCarthy, “The death of writing—if James Joyce were alive today he’d be working for Google,” The Guardian.com, March 7, 2015; http://www.theguardian.com/books/2015/mar/07/tom-mccarthy-deathwriting-james-joyce-working-google.
4 Chris Dancy website; http://www.chrisdancy.com. Accessed April 24, 2015.
5 Ibid.
6 Gary Wolf, “The Data-Driven Life,” New York Times Magazine, April 28, 2010; http://www.nytimes.com/2010/05/02/magazine/02self-measurement-t.html?_r=0.
7 Josh Cohen, “Quantified Self: The Algorithm of Life,” Prospect, February 5, 2014; http://www.prospectmagazine.co.uk/arts-and-books/quantified-self-the-algorithm-of-life.
8 Dennis Tenen, “Writing Technology,” Public Books blog; http://www.publicbooks.org/fiction/writingtechnology. Accessed April 24, 2015.
9 Lilly Irani, “Microworking the Crowd,” Limn, no. 2, March 2012; http://limn.it/microworking-thecrowd/.
10 Adrien Chen, “The Workers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed,” Wired, October 23, 2014; http://www.wired.com/2014/10/content-moderation/.
11 Julian Dibbell, “The Life of the Chinese Gold Farmer,” New York Times Magazine, June 17, 2007; http://www.nytimes.com/2007/06/17/magazine/17lootfarmers-t.html?pagewanted=all.
12 Ibid. Dibbell adds that “Min would like to explain to ‘real’ players that he is playing for different stakes: ‘I have this idea in mind that regular players should understand that people do different things in the game,’ he said. ‘They are playing. And we are making a living.’”
13 Ibid.
14 William Gibson, The Peripheral (New York: G.P. Putnam Sons, 2014), Chapter 13.
15 Paul Tassi, “Chinese Prisoners Forced to Farm World of Warcraft Gold,” Forbes, June 2, 2011; http://www.forbes.com/sites/insertcoin/2011/06/02/chinese-prisoners-forced-to-farm-world-of-warcraft-gold/. An estimated 80 percent of all gold farmers are in China, which, according to the CIA World Factbook, has the largest population of Internet users in the world. China is thought to be home to 100,000 full-time gold farmers.
16 The UPS monitoring system is described by Esther Kaplan in “The Spy Who Fired Me: The Human Costs of Workplace Monitoring,” Harper’s Magazine, March 2015; http://harpers.org/archive/2015/03/the-spy-who-fired-me/.
17 On the “actuarial self” and “responsibilization,” see Nikolas Rose, Inventing Our Selves, Chapter 7, “Governing Enterprising Individuals” (Cambridge: Cambridge University Press, 1998).
18 On the calculation of risk as it relates to the definition of self, see Limor Darash and Paul Rabinow, eds., Modes of Uncertainty: Anthropological Cases (Chicago: University of Chicago Press, in press).
19 Evgeny Morozov, “The Mindfulness Racket,” New Republic, February 23, 2014; http://www.newrepublic.com/article/116618/technologys-mindfulness-racket.
20 Kai Ryssdal [interviewer], “The Data-Driven Workplace of the Future,” Marketplace [radio broadcast], March 3, 2015; http://www.marketplace.org/topics/business/data-driven-workplace-future.
21 “Know More Close More,” Enkata website; http://www.enkata.com. Accessed April 14, 2015.
22 Timothy Pachirat, Every Twelve Seconds: Industrialized Slaughter and the Politics of Sight (New Haven, CT: Yale University Press, 2011), Chapter IX.
23 Hayles made this comment at “The Total Archive,” a conference at the Centre for Research in the Arts, Social Sciences and Humanities, Cambridge University, March 19–20, 2015. She has developed these ideas in several publications, in which she figures the body as a site of feedback, not a reified thing. Cf. N. Katherine Hayles, How We Became Posthuman (Chicago: University of Chicago Press, 1999).

Lessons from the Ring—Then and Now
Gordon Marino

Years ago, I had the honor of interviewing David Mamet, who, in addition to being a fine playwright, is a longtime practitioner of the martial arts. After our conversation, I asked him to give me one piece of advice I might pass along to my students. He said, “Tell them to pick some physical art—ballet, boxing, judo, yoga, whatever—and to stick with it. It will make them feel grounded and better able to deal with adversity and rejection in this world.” By moving your body in a certain way, he was saying, you will shape the way you feel and who you are. Philosophy professors (including me) assume that we learn to negotiate these things only by reflecting on them. It is as though we have become oblivious to the lessons we can learn on the path leading from the body to the brain.

Once, I confided about an emotional problem to a yoga teacher. She replied, “The answer to the problem is just to breathe.” At the time, I was deeply and rather unreflectively committed to the belief that it is only by thinking that we can solve problems. The yoga teacher’s words awakened me to something I should have known already. After all, I had been training boxers for decades, learning and imparting some of the lessons Carlo Rotella writes about so eloquently in Cut Time: An Education at the Fights:

The deeper you go into the fights, the more you may discover about things that would seem at first blush to have nothing to do with boxing. Lessons in spacing and leverage, or in holding part of oneself in reserve even when hotly engaged, are lessons not only in how one boxer reckons with another but also in how one person reckons with another. The fights teach many such lessons…about getting hurt and getting old, about distance and intimacy…boxing conducts an endless workshop in the teaching and learning of knowledge with consequences.1

Gordon Marino is a professor of philosophy and director of the Howard and Edna Hong Kierkegaard Library at St. Olaf College, Northfield, Minnesota. The author of Kierkegaard in the Present Age and the editor, most recently, of The Quotable Kierkegaard, Marino covers boxing for the Wall Street Journal and Ring Magazine. He has trained professional and amateur boxers for thirty years. Right: Photograph by Mike Powell, Allsport Concepts; Getty Images.


In the sweat-and-blood parlor of the boxing ring, young people deal with feelings they seldom get controlled practice with, such as anxiety and anger. And make no mistake—the kind of people we become is largely determined by the way we negotiate those dreadnought emotions. Many of us come into this world with a surplus of anger. I used to work in a therapeutic capacity with emotionally troubled children. One young fellow was seething with a rage he seldom directly expressed. I got the idea of putting on the gloves with him and letting him knock me around. In our sessions, he discovered that his bottled-up anger was not necessarily lethal. He could vent his destructive urges, and no one was going to die or get seriously hurt.

Coming to terms with our most basic instincts is another thing we learn through boxing. One of the hardest lessons, for instance, is learning how to counter an incoming right hand. When a fist is flying at your face, there is a powerful, natural impulse to pull away. However, when you retreat, your chin often ends up meeting your opponent’s punch at the point where it’s most powerful, and you are no longer in any position to deliver a counterpunch. To help them overcome this reflex, I have my boxers stand with their lead foot about eighteen inches away from their sparring partner. The boxer on offense fires a one-two (a straight right jab), and the one on defense has to block or elude the fusillade, but is forbidden to go back. This drill forces boxers to concentrate on standing their ground. When I catch a boxer obeying instinct and pulling away, I bellow, “You’re in the ring with fear now—beat it down!” Whether or not learning to stand your place in the path of pain can help one hold one’s ground in fighting injustice, I can’t be sure. But it is hard to lead a righteous life when we quiver before the possibility of taking a hit. Nelson Mandela understood this, and rigorously trained as a boxer with the conscious purpose of strengthening his mind and will.

Forgive me for what might sound like stereotyping, but at least at this time, some of the lessons learned through boxing are quite different for women. When they begin training, many women won’t extend their fist to land a punch. They pull back before the fist reaches its target. That is how deeply the inhibitions against violence and hurting are embedded. On the other hand, there was a woman I once worked with who, when slipping on the gloves for the first time, could not help exclaiming, “This feels cool!” “What?” I asked. “Making a gloved fist!” she said.

Woman and punching bag by Alfonse Pagano; Photolibrary/Getty Images.

The mere feeling of being ready to punch was exhilarating to her. But most of boxing’s lessons apply equally to women and men. Søren Kierkegaard once described anxiety as a “sympathetic antipathy or antipathetic sympathy”—a simultaneous attraction and repulsion. And so it goes with many who take up the sport. They come for a few weeks, train, get a punch in the nose, and disappear. A few months later, one of them will call and say, “I want to start training again.” They crave that grounded, at-home-in-oneself feeling toward which Mamet was gesturing.

Women’s presence in the ring is obviously a relatively recent development. But I would say that it is only one of the things that have changed in the world of boxing since I began working and training in gyms in the early 1970s. As with everything in life, some of those changes are for the better, some for the worse, and some are simply mixed. Taken as a whole, they reflect what I think are changing attitudes toward the body and the ends to which it may be put, whether for wisdom, self-understanding, or, more recently, a kind of self-reinvention.

It used to be that there were boxing gyms in almost every neighborhood of our major cities, as well as selected centers of rural life. They were dusky caves where men got together. The elders, often former fighters, traded stories, and the younger guys traded blows. When I was in my early twenties, I trained in Gramercy Park Gym, at 116 East Fourteenth Street in Manhattan. A few years before my time, world champions Floyd Patterson and José Torres had practiced their craft there, under the tutelage of Constantine “Cus” D’Amato, the legendary trainer who would become the mentor and, ultimately, stepfather to Mike Tyson. The clientele at the Gramercy was a mix of working-class guys, some cops, and a few fellows the cops might find themselves chasing down an alley on any given night. It was intimidating to tromp up the three flights of stairs and into this dark and stinking den filled with experienced pugilists, many with faces reshaped by the torrent of blows that had landed on them. But if you showed up on a regular basis and were able to give and take a punch, you discovered a level of mutual respect, friendship, and affection that was hard to find elsewhere.

Fourteenth Street was one of the mean streets, and the Gramercy was an institution, mainly for those who had aspirations to fighting under the klieg lights. But there are places where boxing can serve an even more vital role. In depressed and crime-ridden neighborhoods, these halls of limited warfare are often the closest thing to a safe haven. In his landmark study of the culture of the boxing gym in late-twentieth-century South Side Chicago, Body & Soul: Notebooks of an Apprentice Boxer, French sociologist Loïc Wacquant wrote:

Above all, the gym protects one from the street, and acts as a buffer against the insecurity of the neighborhood and the pressures of everyday life. In the manner of a sanctuary, it offers a cosseted space, closed and reserved, where one can, among like-minded others, shelter oneself from the ordinary miseries of an all-too-ordinary life and from the spells that the culture and economy of the street hold in store for young men trapped into this place scorned and abandoned by all that is the dark ghetto.2

But these sanctuaries came upon hard times. In the 1980s, when rents and insurance premiums in New York and other cities began to soar, many of the local gyms were forced to turn out the lights. The seven or so dollars a month such places typically charged as dues was not enough to cover the rent, and most students of the “sweet science” could not afford to pay the kind of membership fees that would keep gyms solvent. Fortunately, though, not all of them went under, thanks to the creativity of certain gym owners. As rents and other expenses rose, these owners, starting in New York City, began catering to a well-to-do crowd of corporate executives willing to pay for private and semi-private lessons. This new twist was called “white-collar boxing.” In
her superb book Come Out Swinging: The Changing World of Boxing in Gleason’s Gym, Lucia Trimbur has described the changes that took place:

This phenomenon…began in the mid-1980s in New York City when a number of white male businessmen, lawyers, and doctors expressed eagerness to pay substantial sums of money to be trained in the city’s most famous gyms…. Gyms quickly instituted white-collar classes, programs, and leagues, and, at a time when the number of amateur and professional boxers in New York City dwindled, the number of white-collar clients expanded dramatically, keeping urban gyms afloat with their dependable membership dues.3

This white-collar movement swept both the United States and Britain, and although boxing gyms lost much of their old mystique, they were at least able to stay open. In addition, professional and amateur boxers who had often been without work now found gainful employment training the uptown aspirants. When boxing was in its heyday, there was competitive collegiate boxing, but no business executive would have thought of training at a place like the Gramercy. It was widely believed that serious boxers—that is, those who aimed to improve the lives of their families with their fists—needed to be hungry, and the hungrier the better. So why did the hedge-fund managers end up at Gleason’s Gym? The desire for an enhanced sense of masculinity was at least one factor. (As Trimbur observes, “Clients can be obsessed with the perception that their wealth has made them weak.”)4 But there was also a race-related fantasy at work in the white-collar turn to boxing.

To be sure, race, ethnicity, and nationality have always played a role in the world of boxing. In the early and mid-twentieth century, bouts were promoted on the basis of barely sublimated ethnic rivalries: Italians versus Jews, Irishmen versus Italians. Fighters changed their names in order to appeal to the right ethnicities. Even today, some of the most lucrative matchups are between Mexicans and Puerto Ricans. But race is perhaps the most complicated dimension of gym life. Commenting on its role in the white-collar movement, Trimbur puts it bluntly:

When upper-middle-class and upper-class white professionals pay for the expertise of “authentic” black trainers, they are imagining and consuming a notion of blackness defined by the body, narratives of suffering, histories of criminality, and experiences of racial inequality. Clients presume an authentic black identity, and, in turn, produce a form of black masculinity.5

The boxing gym, however, is also a space where repeated physical confrontation breaks down stereotypes, fears, and physical boundaries. I can recall an amateur bout between two thirteen-year-old kids, one white and one black. They had never met before and were from different worlds. For three rounds, they tried to decapitate each other. After the decision was announced, they were hanging all over each other like old friends. Later in the evening, I saw them in the parking lot exchanging phone numbers. However much it is exploited to promote boxers and the big fights, racism is rare among boxers themselves.

Today, men and women with money not only can travel to exotic lands; they can purchase experiences that lead to new versions of themselves. Yes, people have long wanted to know how much they can take or how well they will react to a physical challenge. As the narrator says in Chuck Palahniuk’s novel Fight Club, “If you’ve never been in a fight, you wonder. About getting hurt, about what you’re capable of doing against another man.” But now, for people for whom money is no obstacle, it’s possible to purchase experiences of controlled violence that lead not simply to self-knowledge but to the remaking of the self.

A recent issue of Men’s Health—“The Reinvention Issue”—shows a photograph of a highly chiseled and tattooed Justin Bieber. Next to the teen idol is the boldface headline “CAN HE REINVENT HIMSELF?” Mind you, it is the self, not the body, that is at issue. Of course, the implied answer to the question is yes—he can reinvent himself, and you can too, by changing your body. Boxing has become one way to that end. Once a school of hard knocks, the boxing gym has become an arena of self-reinvention. The place may endure, imparting some of the same old lessons, but what many people now make of those lessons is indeed something new and different.

Endnotes

1 Carlo Rotella, Cut Time: An Education at the Fights (New York: Houghton Mifflin, 2003), 2.
2 Loïc Wacquant, Body & Soul: Notebooks of an Apprentice Boxer (New York: Oxford University Press, 2004), 14.
3 Lucia Trimbur, Come Out Swinging: The Changing World of Boxing in Gleason’s Gym (Princeton: Princeton University Press, 2013), 118.
4 Ibid., 137.
5 Ibid., 118–119.


The Witness of Literature: A Genealogical Sketch
Alan Jacobs

My story is important not because it is mine, God knows, but because if I tell it anything like right, the chances are you will recognize that in many ways it is also yours. Maybe nothing is more important than that we keep track, you and I, of these stories of who we are and where we have come from and the people we have met along the way because it is precisely through these stories in all their particularity, as I have long believed and often said, that God makes Himself known to each of us most powerfully and personally. If this is true, it means that to lose track of our stories is to be profoundly impoverished not only humanly but also spiritually.
—Frederick Buechner1

Long ago at Calvin College’s Festival of Faith and Writing—an enormous biennial gathering of writers, would-be writers, and passionate readers, most but not all of them Christians—I had a curious and memorable experience. The featured speaker that year was Frederick Buechner,
a novelist and memoirist whose general fame was greatest at the beginning of his career, in the 1950s, and who, since then, had produced a series of well-reviewed but not especially popular books. His 1981 novel Godric was a finalist for the Pulitzer Prize in fiction; this is as close as he has come to winning a major literary award. Yet among those attending the Festival of Faith and Writing, Frederick Buechner was simply a rock star. My wife and I had known Buechner for many years, and we arranged to meet him for coffee and a talk, before having dinner later in a larger group. But this private meeting proved difficult to arrange. So many people wanted to see him, to thank him, to get him to sign their often-reread copies of his books—it was more than Buechner, or anyone else, could handle, and he had to be kept out of sight. So we were ushered in cloak-and-dagger fashion to a small, out-of-the-way room where the author was ensconced, so we could recall old times and catch up a bit. An awkward situation ensued.

Alan Jacobs is a distinguished professor of humanities in the honors program at Baylor University. A prolific essayist, reviewer, and blogger, he is the author, most recently, of “The Book of Common Prayer”: A Biography (2013) and The Pleasures of Reading in an Age of Distraction (2011). Left: Joy Hulga’s Leg by Blair Hobbs; courtesy of the artist. Image inspired by Flannery O’Connor’s short story “Good Country People.”


The kind and efficient people running the festival clearly expected us to have five minutes with the great man and then depart; Buechner equally clearly expected to spend some time chatting. So when another visitor came in and we rose to leave, Buechner insisted that we sit back down. As it turned out, then, we spent most of the afternoon there, having our conversation regularly interrupted by new visitors. Some of these were other festival speakers—for instance, Alfred Corn, the distinguished poet and critic, dropped by, and he and Buechner compared notes for a few moments on shared friends and acquaintances in New York—but most were simply lovers of Buechner’s work who had managed through some means unknown to us to gain brief admission to his presence. And almost all of them told the same story: Your writing has meant everything to my Christian faith. I don’t think I could be a Christian without your books.

Throughout that afternoon—rising to greet strangers, then sitting down and striving to remain inconspicuous as they poured out their hearts—I couldn’t help reflecting on the sheer oddity of the situation. These were people, by and large, who knew the Bible, who attended church, who had the benefits of Christian community. Yet they testified, almost to a person, that Christian belief would have been impossible for them without the mediation of the stories told by Frederick Buechner. I know literary history fairly well, especially where it intersects with Christian thought and practice, and it seemed to me that such radical dependence on literary experience would have been virtually impossible even a century earlier. But I also knew that Buechner’s role was anything but unique, that other readers would offer the
same testimony to the fiction of Walker Percy or Flannery O’Connor or C.S. Lewis. How did such a state of affairs come about? How did literary writers come to be seen by many as the best custodians and advocates of Christian faith? It is a question with a curious and convoluted genealogy, one worth teasing out.

* * *

HUMANITIES: grammar, rhetoric, and poetry…for teaching of which, there are professors in the universities of Scotland, called humanists.
—Encyclopaedia Britannica (1768)

Cicero, in his Pro Archia, refers to the studia humanitatis ac litterarum: humane and literary studies. This phrase caught the eye of some early Renaissance scholars, especially the Tuscan Coluccio Salutati, correspondent of Petrarch, and his student Leonardo Bruni; it encapsulated their understanding of what education at its highest level should be. In the Italian universities of the fifteenth century, one who advocated this model and taught according to it was known as an umanista—an inevitable coinage, since a teacher of jurisprudence had long been known as a jurista, a teacher of canon law a canonista, and so on. So the term humanist, from which humanism in turn derives, was originally the product of student slang. As Paul Oskar Kristeller explained long ago in what remains a useful treatment of the history, in the early modern period and especially in Italy, “the studia humanitatis came to stand for a clearly defined cycle of scholarly disciplines, namely grammar, rhetoric, history, poetry, and moral philosophy,” pursued primarily by reading the greatest Latin writers, though eventually, in a secondary way, the major Greek figures were also included. Other philosophical subdisciplines that had their own professors, such as
logic and metaphysics, played no part in the humanists’ project. The studia humanitatis therefore were “concerned…neither with the classics [as such] nor with philosophy [as such]”; their focus “might roughly be described as literature. It was to this peculiar literary preoccupation that the very intensive and extensive study which the humanists devoted to the Greek and especially to the Latin classics owed its peculiar character, which differentiates it from that of modern classical scholars since the second half of the eighteenth century.”2

Kristeller’s use of “peculiar” twice in that last-quoted sentence is a stylistic infelicity, but a telling one. The umanistas were doing something unprecedented in keying the search for wisdom—including, as we shall see, specifically Christian wisdom—to the study of literature. This was, to put the point mildly, not in keeping with the dialectical approach of the medieval scholastic tradition, which they scorned. How this literary approach to the moral and social education of young men emerged is not, I think, perfectly understood, but some of the groundwork for it may have been laid by Boccaccio, writing in his Life of Dante in 1374. In the passage that follows, Boccaccio uses the term “theology” to mean “Holy Scripture”:

The subject of sacred poetry is divine truth, while that of the ancient poets is men and the gods of the pagans. They are opposite in so far as theology proposes nothing that is not true; poetry supposes certain things as true which are
most false and erroneous and contrary to the Christian religion.… I say that theology and poetry may be said to be almost one thing when the subject is the same; and I say further that theology is nothing else than the poetry of God.… The sense of our Comedy…whether you call it moral or theological…is, at whatever part of the work most pleases you, the simple and immutable truth, which not only cannot receive corruption, but, the more it is searched, the greater odor of incorruptible sweetness it emits to those who regard it.3

It is hard to imagine a scholastic dialectician not being utterly scandalized by Boccaccio’s claim that “theology is nothing else than the poetry of God”: Literature, and not philosophical theology, becomes the foundational genre of God’s revelation. And of course poetry demands to be read as poetry, not as philosophy. So if Boccaccio is right, then a wholly different intellectual toolbox from that provided by Scholasticism is required for the one who seeks Holy Wisdom.

If I am right in thinking that the Life of Dante was a key text in the emergence of literary humanism, it is noteworthy that Boccaccio makes his case for the wisdom-giving power of poetry through a recent text written in the Italian vernacular—after all, this is not the direction the umanistas would take. But soon after he wrote about Dante, Boccaccio worked on a far larger and more ambitious project, the Genealogia deorum gentilium, or Genealogy of the Pagan Gods, in which he showed that the then-standard model of biblical exegesis—with its identification of four distinct levels of meaning, the literal, moral, allegorical, and anagogical—could usefully be applied by the Christian interpreter to pagan myths. So, for example, the story of how Perseus killed the Gorgon Medusa and then flew away on winged sandals becomes, on the moral level, “a wise man’s triumph over
vice and his attainment of virtue”; on the allegorical level, “the pious man who scorns worldly delight and lifts his mind to heavenly things”; on the anagogical level, “Christ’s victory over the Prince of this world and his Ascension.”4 Boccaccio thus shows in his Life of Dante how a great Christian writer can use poetry to convey to us “the simple and immutable truth,” and in his Genealogy of the Pagan Gods how shrewd Christian readers can use interpretative methods originally developed for the study of the Bible to liberate the wisdom hidden in pagan texts. It is a strategy encompassing the Christian and the non-Christian, the writerly and the readerly.

Boccaccio died in 1375; twenty years later, another important development in this story was marked by the election of thirty-two-year-old Jean Gerson as chancellor of the University of Paris. In a superb work of scholarship on Gerson and—in the words of the book’s subtitle—“the transformation of late medieval learning,” Daniel Hobbins shows how the French academic gradually distanced himself from scholastic dialectical procedure and sketched out a new direction for Christian intellectual writing. Gerson was himself formed by scholastic education, and warmly commended its emphasis on sound logic: “We can never speak truly and properly without correct use of logic.” But he also came to believe that the ways the scholastics deployed their logic had become stultifyingly rigid, and 68

he constantly sought alternative means of organizing and presenting ideas. Some of Gerson’s devices, Hobbins acknowledges, “may seem puzzling or contrived to modern tastes, but they are evidence of a creative mind at work striving for new forms of presentation.” It is noteworthy that in the many dialogues Gerson composed for varying contexts, including even sermons, an especially prominent character is Studiositas speculatrix (Earnest Investigator), whose questions seem never to end. How to satisfy this relentless inquirer—this is a problem Gerson thought the traditional scholastic models failed to solve.5 Gerson was especially frustrated by the scholastic habit of addressing a disputed question merely by piling up citations to authorities, a habit already mocked by his older contemporary Geoffrey Chaucer in The Canterbury Tales—for instance, in the “Nun’s Priest’s Tale,” where Chaunticleer the rooster uses a barrage of references to out-argue his consort, Pertelote: “Oon of the gretteste auctours that men rede/Seith thus.… And certes, in the same book I rede/ Right in the nexte chapitre after this.… Lo, in the lyf of Seint Kenelm I rede.… And forthermoore I pray yow looketh wel/In the olde testament of Daniel.” It is far too easy, Gerson came to believe, for the point of the discourse to be lost in the apparatus. Hobbins again: “Abandoning unnecessary citations and ‘coming to the point and the heart of the matter as it seems to me,’ as he says in a French sermon—this direct and personal approach is perhaps the most distinctive trait of Gerson’s style.”6 For Gerson, the problem was that the scholastics had paid but lip service to rhetoric as one of the foundational disciplines of the artes liberales: “We write, but we give no weight to our sentences, no number and measure to our words. Everything we write is flaccid, coarse, and sluggish. We write not new things but old, and when we try to pass them off as our own by recycling them, we deform them and render them absurd.”7

It might appear that Gerson is merely another of the umanistas discussed earlier, with whom he was roughly contemporary, but he differs from them in certain significant ways. First of all, he retains far more respect for dialectical method and logic than they did, which is precisely why he soon became a marginal figure and was almost forgotten after his death: For the umanistas he was too scholastic, for the schoolmen too humanistic. He tried to be a mediating figure in a time of intellectual war. But second, and more important for my purposes here, Gerson’s approach to rhetoric is not driven by a reverence for the unique greatness of classical authors, but rather by a desire to reach and move his audiences, whether in pastoral or academic contexts. Rhetoric for him is not a matter of conformity to incontrovertible Ciceronian norms, but rather the concern to find words that will stir people’s hearts as well as their minds. Hobbins notes Gerson’s great reverence for the Italian theologian and philosopher Bonaventure, who a century earlier had written in his Collationes in Hexaemeron (Talks on the Six Days of Creation) that “not through hearing alone but through heeding [observando] is one made wise.”8

If we combine Boccaccio’s insistence on the theological power of poetry with Gerson’s desire to find a rhetoric that will move hearers and readers toward godly obedience, we end up with something like the argument Sir Philip Sidney makes in one of the great documents of the English Renaissance, An Apologie for Poetrie (c.1579)—which nevertheless is to be distinguished from those predecessors in important ways as well. At a crucial stage of his argument, Sidney places his apologia within a general account of the purposes of education:

This purifying of wit, this enriching of memory, enabling of judgment, and enlarging of conceit, which commonly we call learning, under what name soever it come forth, or to what immediate end soever it be directed, the final end is, to lead and draw us to as high a perfection, as our degenerate souls made worse by their clayey lodgings, can be capable of.

Rhetoricians at a Window, c.1661–1666 by Jan Steen (1626–1679); The Philadelphia Museum of Art/Art Resource, NY.

What studies can bring about the “perfection” of which Sidney writes, can help our souls acquire “true virtue”? Sidney answers that though “some…thought this felicity principally to be gotten by knowledge,” they disputed which knowledge was the most valuable: astronomy, natural philosophy, metaphysics, or music. Sidney rejects all of these as “but serving sciences,” only truly useful if they “serve” the greater end, “the mistress knowledge, by the Greeks called architektonike, which stands, as I think, in the knowledge of a man’s self, in the ethic and politic consideration, with the end of well-doing, and not of well-knowing only.” In the art of moving men to “well-doing,” “the poet is worthy to have it before any other competitors.”

Here is a strong commendation, but again, although Sidney’s identification of literature’s power to move its audience to “virtuous action” echoes some of the beliefs of Boccaccio and Gerson, surely both of them would have found his formulation woefully lacking in theological specificity. And so as Renaissance humanism comes into its pedagogical and literary maturity, it simultaneously sheds its distinctively Christian character; it seeks a language that transcends—or, some might say, evades—theological detail. Surely this was a natural enough response to a century that had seen the rise of violent religious controversy that would not subside for many more decades.

world through disobedience. Sidney’s model of education is ethical and in a certain sense spiritual, but there is nothing specifically Christian about it, and it may even run counter to Christian orthodoxy, even though Sidney himself was an earnest Christian. So in the transition from Boccaccio to Gerson to Sidney, we see an intellectually powerful, literarily sophisticated Christian humanism arise, only to clip its own wings lest it contribute to a continent’s discord. Milton’s rearguard action, in his verse as well as his prose—Paradise Lost would form the last great monument of early modern Christian humanism—had little chance of achieving great influence. Paradise Lost is as theologically and biblically specific a poem as one can imagine, but its great fame in the two centuries following its publication in 1667 was perpetuated by many a poet who either ignored Milton’s theology or, as William Blake famously did—“Milton was of the Devil’s party without knowing it”—set it at odds with the poem. * * *

This process of moving beyond—or away from—theology was not smooth and unruffled. Almost a century after Sidney, John Milton would strive for a more distinctively Christian account of what education should do: “The end then of Learning is to repair the ruins of our first Parents by regaining to know God aright, and out of that knowledge to love him, to imitate him, to be like him, as we may the nearest by possessing our souls of true virtue, which being united to the heavenly grace of faith makes up the highest perfection” (Of Education, 1644). It seems likely that Milton’s definition is meant to extend and correct Sidney’s: While Sidney speaks of “degenerate souls” in “clayey lodgings”—a formulation more Platonic than Christian— Milton identifies our problem in the specific terms of the biblical narrative, according to which Adam and Eve brought “ruin” into the 70

There in the desert I lay dead,
And God called out to me and said:
"Rise, prophet, rise, and hear, and see,
And let my works be seen and heard
By all who turn aside from me,
And burn them with my fiery word."
—Alexander Pushkin, "The Prophet" (translation by D.M. Thomas)9

It would seem, then, that the story of Christian humanism was effectively over, especially given the general decline of orthodox (or even unorthodox) Christian belief among the learned in the eighteenth and nineteenth centuries. But with the advent of Romanticism, matters took an interesting turn, not because the Romantics were (by and large) either Christian or humanist in the senses in which I have been using those terms, but because they provided, whether they meant


to or not, a new way in which a reconstituted humanism, simultaneously Christian and literary, could reform itself. The “way” I speak of here is a literary one, but it is not the only such point of reentry. A distinctively philosophical Christian humanism also arises in the nineteenth century, largely at the instigation of Pope Leo XIII in his 1879 encyclical Aeterni Patris, which led to the enthronement of Thomas Aquinas as a model of Christian thought and the subsequent claim, made by many Catholic thinkers, most notably Jacques Maritain, that Christianity is the only genuine humanism. That new philosophical humanism overlaps with the literary one to some degree— certainly Maritain tries to draw them together, as does Flannery O’Connor in her frequent references to Aquinas in her correspondence—but, practically speaking, they develop as largely separate trends.


The renewal of a literary Christian humanism is illustrated by a highly representative Victorian book, The Autobiography of Mark Rutherford (1882). Recounting an episode from his university days, the author writes, But one day in my third year, a day I remember as well as Paul must have remembered afterwards the day on which he went to Damascus, I happened to find amongst a parcel of books a volume of poems in paper boards. It was called Lyrical Ballads, and I read first one and then the whole book. It conveyed to me no new doctrine, and yet the change it

wrought in me could only be compared with that which is said to have been wrought on Paul himself by the Divine apparition.

"Mark Rutherford" is the pseudonym of William Hale White, an English civil servant who had been raised in a Nonconformist home and studied to become a Congregationalist minister, but abandoned that plan when he lost his faith. It was William Wordsworth who restored that faith—or gave him a new one: White's rhetoric is richly ambiguous on this point, as we can see by continuing to read from the same passage:

God is nowhere formally deposed, and Wordsworth would have been the last man to say that he had lost his faith in the God of his fathers. But his real God is not the God of the Church, but the God of the hills, the abstraction Nature, and to this my reverence was transferred. Instead of an object of worship which was altogether artificial, remote, never coming into genuine contact with me, I had now one which I thought to be real, one in which literally I could live and move and have my being, an actual fact present before my eyes. God was brought from that heaven of the books, and dwelt on the downs in the far-away distances, and in every cloud-shadow which wandered across the valley. Wordsworth unconsciously did for me what every religious reformer has done—he re-created my Supreme Divinity; substituting a new and living spirit for the old deity, once alive, but gradually hardened into an idol.

On the one hand, White wants to say that Wordsworth gave him "no new doctrine"; on the other hand, he claims to have had a dramatic road-to-Damascus conversion from "the God of the Church" to "the God of the hills." The God of the Church had become an idol; now, thanks to the poetry of Wordsworth, White has a living God to whom he can give true worship. This is a step significantly greater than the one White's older contemporary John Stuart Mill took when he supplemented the dry rationalism of his Utilitarian upbringing with the reading of Wordsworth's poems: That had merely been, as Mill put it in his Autobiography (1873), "a medicine for my state of mind" insofar as those poems "expressed, not mere outward beauty, but states of feeling, and of thought coloured by feeling, under the excitement of beauty. They seemed to be the very culture of the feelings, which I was in quest of." Mill takes pains to insist that this encounter with Wordsworth did not change in any fundamental way his commitments or practices: "I never turned recreant to intellectual culture, or ceased to consider the power and practice of analysis as an essential condition both of individual and of social improvement. But I thought that it had consequences which required to be corrected, by joining other kinds of cultivation with it. The maintenance of a due balance among the faculties now seemed to be of primary importance."10 White, by contrast, was given not just a renewed "culture of the feelings," but a divinity whom he could truly worship.

It is noteworthy that White's mediator is Wordsworth, because Wordsworth's friend and sometime collaborator Samuel Taylor Coleridge had also articulated, in his Biographia Literaria, a theological defense of poetry—but had done so in a more emphatically orthodox manner,

in implicit and sometimes explicit dissent from Wordsworthian devotion. When Coleridge writes, “The primary IMAGINATION I hold to be the living Power and prime Agent of all human Perception, and as a repetition in the finite mind of the eternal act of creation in the infinite I AM,” he is drawing a very clear line between poetic making and the biblical God—the one who says to Moses, “I AM THAT I AM”—not “the God of the hills.”11 Coleridge’s linkage of poetry and divinity was often accepted in his time and after, but his orthodoxy was generally deemed optional or undesirable. In light of this history, we might compare White’s self-accounting to one made some years later by William Butler Yeats, who wrote, “I am very religious, and deprived by Huxley and Tyndall, whom I detested, of the simple-minded religion of my childhood, I had made a new religion, almost an infallible church, out of poetic tradition: a fardel of stories, and of personages, and of emotions, a bundle of images and of masks passed on from generation to generation by poets and painters with some help from philosophers and theologians.” The dogma of this church Yeats states in these terms: “Because those imaginary people are created out of the deepest instinct of man, to be his measure and his norm, whatever I can imagine those mouths speaking may be the nearest I can go to truth.”12 Thomas Henry Huxley and John Tyndall are often referred to as atheists, inaccurately: Huxley coined the term “agnostic” to describe his position, and Tyndall never specified his religious views. But both of them insisted that the claims of religion had to be subordinated to those of science—that only science was productive of knowledge, although religious faith could be a strong support of morals. In any event, to the arguments of Huxley and Tyndall against traditional religion, Yeats had no answer—just as William Hale White had no answer to the doubts that assailed him—until literature and the other arts came to the rescue. But what they rescued


was something very different from Trinitarian Christianity, or any such “simple-minded religion” that a poet might associate with childhood. We have come a long way here from Boccaccio and Gerson and even Sidney, and yet there are genuine continuities to be noted and accounted for. In the aftermath of Romanticism, with its cult of intuition and imagination as reliable pathways to truth, and its skepticism about the desiccating effects of reason as defined by the chief figures of the Enlightenment, we hear an echo of the humanist reinstatement of rhetoric, first as supplemental to and then as superior to dialectic. Only now, it is the aesthetic that complements (in Mill) and then transcends (in White and Yeats) the narrowly rational. The access to religious truth that philosophy and science seal off is re-enabled by aesthetic, and especially literary, experience.

This development seems to happen throughout Europe in the latter half of the nineteenth century. It is responsible for the founding of Browning Societies even during Robert Browning's lifetime, based largely on the belief that the true religious spirit might be breathed in, with unique ease and comfort, through poetry. In a typical passage from the early papers of the London Browning Society, an admirer wrote, "I must claim for Browning the distinction of being pre-eminently the greatest Christian poet we have ever had. Not in a narrow dogmatic sense, but as the teacher who is as thrilled-through with all Christian sympathies as with artistic or musical."13 It is striking how strongly this assertion resembles the claims made in Russia for the novels of Fyodor Dostoevsky—claims that Dostoevsky himself endorsed by linking himself with Pushkin's poem "The Prophet"—even though the history of Christianity and Christian thought in Russia is so dramatically different from its English counterpart. Thus, Vladimir Solovyev, in a eulogy delivered just after Dostoevsky's death in 1881, wrote that "just as the highest worldly power somehow or other becomes concentrated in one person, who represents a state, similarly the highest spiritual power in each epoch usually belongs in every people to one man, who more clearly than all grasps the spiritual ideals of mankind, more consciously than all strives to attain them, more strongly than all affects others by his preachments. Such a spiritual leader of the Russian people in recent times was Dostoevsky."14 The Victorian era was one dominated by sages of all kinds, but this is an especially peculiar one: the poet as prophet, as comforter, as vehicle of the sacred.

* * *

No eye his future can foretell
No law his past explain
Whom neither Passion may compel
Nor Reason can restrain.
—W.H. Auden, libretto to The Rake's Progress

For Victorian intellectuals, the versions of the Christian faith that could not be taken seriously were those of the previous generation: The faith of one's parents, whether biological, intellectual, or aesthetic, will inevitably seem "simple-minded." But children eventually become parents. The early-nineteenth-century Russian liberals whom Peter the Great had done so much to create, who looked back with scorn on traditional Orthodoxy, themselves came to seem absurdly naive to Dostoevsky, who satirized them mercilessly, especially in the character of Peter Verkhovensky in Demons (also known as The Possessed). Something similar came to befall the English Victorian advocates of an enlightened religion, or no religion at all. Consider, for example, a story related by C.S. Lewis, who lost his faith in early adolescence and then went on to be tutored for some years by "a 'Rationalist' of the old, high and dry nineteenth-century type," a man named Kirkpatrick. "At the time when I knew him, Kirk's Atheism was chiefly of the anthropological and pessimistic kind. He was great on The Golden Bough and Schopenhauer."15 But while Lewis was studying with Kirkpatrick, something happened: He started reading the fiction of George MacDonald, who, though a Victorian and not by every standard orthodox in his theology, was nevertheless the kind of person Lewis called "a thoroughgoing supernaturalist." Lewis's account of this experience is extremely telling:

What Phantastes [MacDonald's 1858 novel] actually did to me was to convert, even to baptize…my imagination. It did nothing to my intellect nor (at that time) to my conscience. Their turn came far later and with the help of many other books and men. But when the process was complete—by which, of course, I mean "when it had really begun"—I found that I was still with MacDonald and that he had accompanied me all the way and that I was now at last ready to hear from him much that he could not have told me at that first meeting. But in a sense, what he was now telling me was the very same that he had told me from the beginning.16

We might usefully compare this with Lewis's recollection of encountering, a few years later, the essays of G.K. Chesterton: "I had never heard of him and had no idea of what he stood for; nor can I quite understand why he made such an immediate conquest of me," since, Lewis wrote, "my pessimism, my atheism, and my hatred of sentiment [should] have made him to me the least congenial of all authors." But without agreeing with Chesterton ("I did not need to accept what Chesterton said in order to enjoy it"), the young Lewis was "charmed" by him. He concludes this account by commenting, "In reading Chesterton, as in reading MacDonald, I did not know what I was letting myself in for."17

To grasp the import of this statement, one must take it literally: Lewis did not know what he was letting himself in for. What he had acquired was not knowledge of or even mere knowledge about Christian belief and practice, but, rather, a disposition to openness, a willingness to be charmed. This is why he says that reading MacDonald baptized his imagination: Baptism is the Christian rite of initiation, the beginning of new life in Christ, and, in Lewis’s Anglican tradition, something that typically happens to infants. Through MacDonald, he had been initiated into habits of aesthetic experience that would later make him receptive to Chesterton for reasons he could not then have stated: Only later (“when it had really begun”) would come the knowledge that enabled him to give an account of what had happened to him when he read those books. A quarter-century after Lewis underwent his imaginative baptism, a Frenchwoman of Jewish parentage but no religious upbringing would have an experience that bears notable structural similarities to his, but through an encounter with a seventeenth-century poet rather than a Victorian writer of fantasy. In 1942, Simone Weil


wrote to the priest who had become her informal counselor—informal because she refused to be received into the Catholic Church—that in 1938 she had spent Passion Week at the monastery of Solesmes, where she attended services every day. There, she met a young Englishman whose face seemed to register some extraordinary experience when he received Communion, and who “told me of the existence of those English poets of the seventeenth century who are named metaphysical.” George Herbert’s “Love III”—a kind of allegory of the Lord’s Table—struck her with particular force, and she memorized the poem. Weil told her counselor, Father Perrin, that “often, at the culminating point of a violent headache, I make myself say it over, concentrating all my attention upon it and clinging with all my soul to the tenderness it enshrines.” But she did not know what she was letting herself in for by reciting such a poem. “I used to think I was merely reciting it as a beautiful poem, but without my knowing it the recitation had the virtue of a prayer. It was during one of these recitations that, as I told you, Christ himself came down and took possession of me.”18 Here again, the imaginative or aesthetic experience precedes and paves the way for intellectual understanding. The way of reason is not rejected—indeed, the opposite is true—but the reason has to be released from its bondage in order to function properly. To borrow language from the philosopher Charles Taylor in A Secular Age (2007), the buffered self must become in some respect porous, and the most vulnerable buffer is the one that protects the imagination. To use the language of “imagination” is to employ a post-Romantic concept, one that Boccaccio or Gerson or Sidney would have found baffling. (In early modern translations of the Bible, the word “imagination” is, without exception, used in a highly pejorative way: “Yet they obeyed not, nor inclined their ear, but walked every one in the imagination of their evil heart,” from Jeremiah 11:8, is typical.) The

twentieth-century British thinker Owen Barfield best understood the recuperation and elevation of the term: Its rise marks “the transition from a view of art which beholds it as the product of a mind, or spirit, not possessed by the individual, but rather possessing him; to a view of it as the product of something in a manner possessed by the individual though still not identical with his everyday personality.”19 My imagination, then, is not identical with my conscious mind—it works in some sense on its own, independent of my volition—but it does not come from without, it does not and cannot possess or (in older senses of the term) inspire. But it is precisely because imagination does not present itself as transcendent, and therefore does not put up the buffers, raise the shields, that it can become a vehicle of the transcendent. It constructs a back door to God.


I should note that imagination creates this door for readers more than for writers, at least in some cases. Writers may indeed dissent from the model of reception I have described. Flannery O'Connor, a lifelong Catholic rather than a convert or returnee like Weil, Lewis, and Coleridge, placed dogmatic belief front and center in her own thinking: "Your beliefs will be the light by which you see, but they will not be what you see and they will not be a substitute for seeing"20—a point of view rather different from the one that makes imagination the light by which dogma is seen, and recognized as desirable. But the key point for the reader is not how the writer sees but that the writer sees. All those who are led to and strengthened in religious faith by writers must believe that writers have, at the very least, superior powers of perception enabled by superior imagination. Percy Shelley's claim that "poets are the hierophants of an unapprehended inspiration" merely states in extravagant terms what all believers in the salvific witness of literature must affirm.

For this reason, it is typically sufficient that the writer reveal the conditions against which he or she dare not and need not preach. O'Connor again: "We are now living in an age which doubts both fact and value. It is the life of this age that we wish to see and judge."21 Likewise, Walker Percy, a physician by training, found commonality between the doctor and the writer in the act of diagnosis: "To the degree that a society has been overtaken by a sense of malaise rather than exuberance, by fragmentation rather than wholeness, the vocation of the artist, whether novelist, poet, playwright, filmmaker, can perhaps be said to come that much closer to that of the diagnostician rather than the artist's celebration of life in a triumphant age."22 The diagnostic novelist or poet—Auden comes to mind as a poet with this kind of forensic and etiological temperament—certainly "judges," but judges by portrayal and implication. And this is certainly for the best, if one would avoid triggering the powerful buffers and shields of modernity. "Tell the truth but tell it slant," Emily Dickinson famously counseled, and throughout the long and meandering history of Christian humanism we see an increasingly strong preference, among a certain kind of reader, for the slanted truth.

A defense of the powers of poetry (what we would call "literature") to carry Christian truth faithfully and vividly—a defense that arose in a period when dialectical method dominated European universities—underwent a series of transformations that seemed, at one point, to culminate in the victory of a Wordsworthian "God of the hills," a deity composed wholly of affect. At the end of the Victorian era, few could have imagined that in the next century literature would become, for many readers, not just the preferred but the only vehicle for conveying and commending a strongly traditional form of Christianity. But that is precisely what occurred. When institutional Christianity came increasingly to be despised, when preaching acquired a largely negative connotation, stories and poems took both their places. Perhaps no one from Boccaccio to William Hale White would have known quite what to make of this. I confess that I myself do not know quite what to make of it, especially since I do not see any obvious heirs to Buechner—who himself is both less orthodox and far less popular than Lewis, O'Connor, or Percy. Perhaps the kind of thing I witnessed that day at Calvin College—Your writing has meant everything to my Christian faith. I don't think I could be a Christian without your books—will prove to have been merely a local and temporary phenomenon, a curious sideshow in twentieth-century Western Christianity. I hope not.

Endnotes

1. Frederick Buechner, Telling Secrets (San Francisco: HarperSanFrancisco, 1991), 30.
2. Paul Oscar Kristeller, Renaissance Thought: The Classic, Scholastic, and Humanist Strains (New York: Harper, 1961), Chapter 1.
3. Boccaccio, Life of Dante, trans. G.R. Carpenter (New York: Grolier Club, 1900), 142.
4. Boccaccio on Poetry: Being the Preface and the Fourteenth and Fifteenth Books of Boccaccio's Genealogia Deorum Gentilium, trans. and ed. Charles G. Osgood (Indianapolis: Library of Liberal Arts, 1956).
5. Daniel Hobbins, Authorship and Publicity before Print: Jean Gerson and the Transformation of Late Medieval Learning (Philadelphia: University of Pennsylvania Press, 2009), 106.
6. Ibid., 109.
7. Ibid., 120.
8. Ibid., 120.
9. Cited as the epigraph to Joseph Frank, Dostoevsky: The Mantle of the Prophet, 1871–1881 (Princeton: Princeton University Press, 2003).
10. John Stuart Mill, Autobiography (1873), Chapter V; http://www.gutenberg.org/cache/epub/10378/pg10378.html. Accessed March 23, 2015.
11. Coleridge, Biographia Literaria (1817), Chapter 13.
12. The Collected Works of W.B. Yeats, Vol. III: Autobiographies, eds. William H. O'Donnell and Douglas N. Archibald (New York: Simon and Schuster, 2010).
13. The Browning Society's Papers, Parts 1–3 (London: Browning Society, 1881); http://books.google.com/books?id=sWc4AAAAYAAJ. Accessed March 23, 2015.
14. Quoted in Joseph Frank, Dostoevsky: The Mantle of the Prophet, 756.
15. C.S. Lewis, Surprised by Joy: The Shape of My Early Life (New York: Houghton Mifflin Harcourt, 1955), 139.
16. C.S. Lewis, Preface to George MacDonald: An Anthology (San Francisco: HarperSanFrancisco, 2001), xxxviii. Originally published 1946.
17. Lewis, Surprised by Joy, 191.
18. Simone Weil, "Spiritual Autobiography," in Waiting for God, trans. Emma Craufurd (New York: Harper Perennial, 2001), 27. Originally published 1951.
19. Owen Barfield, Speaker's Meaning (Middletown, CT: Wesleyan University Press, 1967).
20. Flannery O'Connor, Mystery and Manners: Occasional Prose (New York: Macmillan, 1969), 91.
21. Ibid., 117.
22. Walker Percy, "Diagnosing the Modern Malaise," in Signposts in a Strange Land: Essays (New York: Macmillan, 2000), 206. Originally published 1991.


On the Value of Not Knowing Everything
James McWilliams

IN JANUARY 2010, WHILE DRIVING FROM

Chicago to Minneapolis, Sam McNerney played an audiobook and had an epiphany. The book was Jonah Lehrer’s How We Decide, and the epiphany was that consciousness could reside in the brain. The quest for an empirical understanding of consciousness has long preoccupied neurobiologists. But McNerney was no neurobiologist. He was a twenty-year-old philosophy major at Hamilton College. The standard course work— ancient, modern, and contemporary philosophy—enthralled him. But after this drive, after he listened to Lehrer, something changed. “I had to rethink everything I knew about everything,” McNerney said. Lehrer’s publisher later withdrew How We Decide for inaccuracies. But McNerney was mentally galvanized for good reason. He had stumbled upon what philosophers call the “Hard Problem”—the quest to understand the enigma of the gap between mind and body. Intellectually speaking, what McNerney experienced was like

diving for a penny in a pool and coming up with a gold nugget. The philosopher Thomas Nagel drew popular attention to the Hard Problem four decades ago in an influential essay titled “What Is It Like to Be a Bat?” Frustrated with the “recent wave of reductionist euphoria,”1 Nagel challenged the reductive conception of mind—the idea that consciousness resides as a physical reality in the brain—by highlighting the radical subjectivity of experience. His main premise was that “an organism has conscious mental states if and only if there is something that it is like to be that organism.”2 If that idea seems elusive, consider it this way: A bat has consciousness only if there is something that it is like for that bat to be a bat. Sam has consciousness only if there is something it is like for Sam to be Sam. You have consciousness only if there is something that it is like for you to be you (and you know that there is). And here’s the key to all this: Whatever that “like” happens

to be, according to Nagel, it necessarily defies empirical verification. You can't put your finger on it. It resists physical accountability.

McNerney returned to Hamilton intellectually turbocharged. This was an idea worth pondering. "It took hold of me," he said. "It chose me—I know you hear that a lot, but that's how it felt." He arranged to do research in cognitive science as an independent study project with Russell Marcus, a trusted professor. Marcus let him loose to write what McNerney calls "a seventy-page hodgepodge of psychological research and philosophy and everything in between." Marcus remembered the project more charitably, as "a huge, ambitious, wide-ranging, smart, and engaging paper." Once McNerney settled into his research, Marcus added, "it was like he had gone into a phone booth and come out as a super-student."3

James McWilliams is a professor at Texas State University and the author of A Revolution in Eating: How the Quest for Food Shaped America and Just Food: Where Locavores Get It Wrong and How We Can Truly Eat Responsibly. His work has appeared in Harper's, The New York Times, The New Yorker online, and The Paris Review.

Left: Stupor Mundi, 2010, by Mimmo Paladino (b.1948); Galleria Nazionale d'Arte Moderna, Rome, Italy/© Stefano Baldini/Bridgeman Images.

When he graduated in 2011, McNerney was proud. "I pulled it off," he said about earning a degree in philosophy. Not that he had any hard answers to any big problems, much less the Hard Problem. Not that he had a job. All he knew was that he "wanted to become the best writer and thinker I could be." So, as one does, he moved to New York City.

McNerney is the kind of young scholar adored by the humanities. He's inquisitive, open-minded, thrilled by the world of ideas, and touched with a tinge of old-school transcendentalism. What Emerson said of Thoreau—"he declined to give up his large ambition of knowledge and action for any narrow craft or profession"—is certainly true of McNerney. Without pretense, he says things such as "I love carrying ideas in my head, turning them over, and looking at them"; "I have my creative process mapped out"; and "I'm currently incubating a hunch and making it whole." He uses the phrase "connect the dots" quite a bit and routinely evaluates the machinery of his own mind. My favorite example: "I have this insane problem—if I read 300 pages of anything I'll find something to write about, but the more I read the less novel the idea seems and the less I want to write about it." Such are the phenomena that preoccupy him. After our first phone conversation, he called me a "new soul mate" before signing off. Ever mindful, ever curious, he is a poster boy for the humanities.

Bat and Full Moon by Biho Takahashi (Yoshikuni) (b.1873); © UCL Art Museum, University College London, UK/Bridgeman Images.

The Mosh Pit of Thought

But it's unclear how much longer the humanities can nurture the Sam McNerneys of the world. Even at Hamilton—a solid liberal arts college—McNerney was, by his own assessment, something of a black sheep. As he indulged the life of the mind, grappling earnestly with timeless philosophical problems, his friends prepared themselves for lucrative careers in law, medicine, and finance. They never criticized his choice—"They never said, 'You're going to live a shitty life, Sam,'" he told me—but they didn't rush to join him in the mosh pit of thought, either. For all of McNerney's curiosity—one deeply reflective of a humanistic temperament—it has led him headlong into a topic (the Hard Problem) that has the potential to alter permanently the place of the humanities in academic life. If, after all, Nagel is proven wrong—that is, if subjectivity is in fact reducible to an identifiable network of neural synapses—what is the point of investigating the human condition through a humanistic lens? If what it is like to be human, much less a bat, turns out to be empirically situated in the dense switchboard of the brain, what happens to Shakespeare, Swift, Woolf, or Wittgenstein when it comes to explaining ourselves to ourselves? It's perhaps because of this concern that Nagel's famous essay stays famous, playing a rearguard role in philosophy seminars throughout the country. By challenging the very notion of a biological understanding of consciousness, by positing individual consciousness as an existential

reality that defies objectification, "What Is It Like to Be a Bat?" breathes continual life into a phenomenon—the inner subjectivity of experience—that science has yet to illuminate with empirical exactitude. The critic Richard Brody noted as much when he wrote last year in The New Yorker that "the ideas that Nagel unfolds ought to be discussed by non-specialists with an interest in the arts, politics, and—quite literally, in this context—the humanities." He continued:

If Nagel is right, art itself would no longer be merely the scientist's leisure-time fulfillment but would be (I think, correctly) recognized as a primary mode of coming to grips with the mental and moral essence of the universe. It would be a key source of the very definition of life. Aesthetics will be propelled to the forefront of philosophy as a crucial part of metaphysical biology…. The very beauty of Nagel's theory—its power to inspire imagination—counts in its favor.4

And thus Nagel's essay—and the humanities—abide.

Humanities vs. STEM

Behind Brody's optimism there's a much-discussed backstory. You've heard it repeated like a mantra: The humanities are in crisis. They're dying. There is a mass exodus of literature and history and anthropology majors. Some say this is overhyped angst. As a history professor who has seen twenty years of change, I disagree. It's real. A shift is underway in higher education, and that shift, in many ways, mirrors—or at least is a microcosm of—the frenzied quest to solve the mystery of the Hard Problem.

The numbers don't deny it. Nationally, the number of students majoring in the humanities has fallen substantially since 1970.5 At Stanford, 45 percent of the faculty is trained in the humanities, but only 15 percent of students major in humanities fields.6 At Yale, between 1971 and 2013 the proportion of humanities majors dropped from 53 percent to 25 percent among women, and from 37 percent to 21 percent among men. Meanwhile, economics has skyrocketed as a preferred major, with the number of economics majors growing almost threefold at traditionally humanities-inclined institutions such as Brown

University.7 The acronym STEM—science, technology, engineering, mathematics—is now part of every university’s lingua franca. It hardly helped the humanities when a 2012 Georgetown University study found that students in non-technical majors had unemployment rates ranging from 8.9 to 11.1 percent, while graduates in engineering, science, education, and health care had an overall unemployment rate of 5.4 percent.8 It is for good reason that the top five majors at Duke are now in biology, public policy, economics, psychology, and engineering.9 English majors must cringe a little bit when they learn that STEM majors make $32,000 more the year they enter “the real world.”10


Plausible explanations for the withering of the humanities run the gamut. Writing in The New Criterion, Mark Bauerlein, a professor of English at Emory University, blames old-school identity politics: “The minute professors started speaking of literary works as second to race and queerness, they set the fields on a path of material decline.”11 (Cranky.) The essayist Arthur Krystal points to the rise of postmodern theory—particularly deconstructionism—as a culprit. After a “defamiliarized zone of symbols and referents” gutted Western thought, he explains, the result was “the expulsion of those ideas that were formerly part of the humanistic charter.”12 (Stodgy.) But by far the loudest and most controversial response to the crisis in the humanities comes from William Deresiewicz. In his recent book Excellent Sheep,


he highlights the scourge of “credentialism” among “entitled little shits.” He asks, “Do young people still have the chance, do they give themselves the chance, to experience the power that ideas have to knock you sideways?”13 Run ragged by status-driven career agendas, they seem not to. They seem pre-channeled, alienated from the whole notion of ideas for ideas’ sake. “Nothing in their training,” Deresiewicz writes of his pressure-cooked subjects, “has endowed them with a sense that something larger is at stake.”14 Perhaps an even deeper reason for the humanities’ shrinking status is the intensification of a certain and perhaps temporary habit of mind among today’s undergraduates, a habit that Nagel reminds us has severe limitations: the fierce adherence to quantification. Why this turn has happened at this point in time is difficult to say, but it seems fairly certain that a renewed faith in the power of radical empiricism—not to mention the economic advantages it can confer when judiciously applied post-graduation—has decisively lured students out of the humanities and into fields where the defining questions are reducible to just the facts, thank you.

Empiricism Run Amok?

Those left in the wake of this trend scratch their heads and prove the rule. Consider the experience of Logan Sander, a Princeton freshman majoring in comparative literature. In tenth grade, Sander wrote a paper on Kurt Vonnegut's Slaughterhouse-Five in her English class. As with McNerney, the impact of the experience unexpectedly transformed her ambitions. She quickly fell in love with literature and began to read and study fiction with an inspired urgency, reveling in more questions than answers, seeking insights rather than data. After graduating first in her class at Southview High School in Sylvania, Ohio, Sander was invited to an event celebrating valedictorians from twenty-five public high schools. There she discovered, a little to her dismay, that she was the only valedictorian planning to pursue a nonscientific field of study. Today, Sander has discovered a dynamic humanistic bubble at Princeton, but still, she observes, "there's this assumption that if you're pursuing the humanities you're not likely to get a good job or make a lot of money." She remains committed to literature. "Those in it seem to really love what they're doing," she told me. Plus, she wondered, "Since when is the value of the humanities based on money?"15

Of course, there's nothing inherently wrong with an undergraduate shift toward empiricism. Nor is there a problem with pursuing money. Gathering and mapping and deploying objective data will produce everything from a cure for cancer to an app for identifying the finest coffee within a ten-block radius to the best fertilizer for sub-Saharan farmers (and probably has, as far as I know). The quality of human life will surely improve because of such endeavors, and those pursuing them are bound to live accomplished and well-remunerated lives. The world will be better off for their contributions. But…

As Sander says, "The humanities…touch the inner parts of our minds and souls the way technology cannot." Indeed, what about the inner parts of our minds? Our souls?! What about the shimmering but elusive beauty of subjective


experience? What about those things you can’t measure or convey? When it comes to these questions, it’s worth wondering if empiricism hasn’t run amok in the halls of academe. After all, if we go about the business of being ambitious humans armed with an empiricism that grasps and gobbles up and conquers everything up to and including consciousness, it seems reasonable to wonder if we’ll lose something essential to the precarious project of being human—something such as humility. If nothing else, Nagel’s challenge reaffirms the value of humility. I have no hard proof for this thesis, but I think there’s something to it: Knowing that there are things we don’t know—and may never know—has a humbling effect on the human mind. Humility is a form of modesty that asks us to accept ambiguity. Ambiguity, in turn, is ultimately what brings us together to explore the mysteries of existence through the wonder-driven endeavors we lump under that broad umbrella known as the humanities. If we knew it all, if we understood what it was like to be a bat, probably even Logan Sander would not be a comparative literature major. In a way, to catch consciousness, to close the mind-body gap, would be to eliminate that humility. It would be to answer most of the big questions—to collapse the umbrella and move into a post-human world. And that might sound great to logical positivists and atheists and neurobiologists. But as the essayist Charles D’Ambrosio reminds us, “Answers are the end of speech, not the beginning.”16 Are we really ready to stop talking?

Correlating Consciousness

Christof Koch is the chief scientific officer at the Allen Institute for Brain Science, in Seattle. In the world of neurobiology, he's a big deal. If Nagel led the effort to popularize the Hard Problem of

consciousness, Koch leads the effort to solve it. In April 2014, President Obama introduced the US Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative. With $60 million a year going to the Allen Institute, Koch is poised to play a pivotal role in the scientific effort to locate the mind in neurobiological space.


Of all the things he said to me in the course of a lively conversation last November, this, delivered with risible enthusiasm, stood out the most: “At some point, you will know what it’s like to be a bat.” Koch’s faith in empiricism is pure. His reality is deeply physical. Driving his neurobiological approach to consciousness is a deep conviction that science, which he calls “humanity’s most reliable, cumulative, and objective method for comprehending reality,” is a project that “should also help us explain the world within us.” What Nagel identifies as the elusive subjectivity of experience, Koch, in his recent book Consciousness: Confessions of a Romantic Reductionist, refers to as “properties of the natural world.” What Nagel says we’ll never find, Koch, through the swagger of science, insists we’ll own. Koch first explored the mind-body problem while doing a postdoctorate at MIT in the early 1980s. It was a time in his life, he explained, “when I was young and brash and naive and didn’t like the idea that something can’t be solved.” When he moved to the California Institute of Technology in 1986, he developed a lifelong intellectual and personal friendship with the biologist Francis Crick (of DNA double-helix fame), a man whom he came to admire as “a reductionist writ large.” Koch came under Crick’s wing and thrived.


The two men worked closely together on the Hard Problem, postulating what they called "neural correlates of consciousness." They defined these as "the minimal neural mechanisms jointly sufficient for any one specific conscious percept."17 When Crick died in 2004, Koch carried the torch down the path of hard empiricism, more confident than ever that "the weird explanatory gap between physics and consciousness" could be closed to produce "a complete elucidation of consciousness."18

Koch's current working hypothesis is elegant. A précis of it might go something like this: Consciousness begins and ends with neural information. The mind is inextricably bound up with verifiable information zipping through the brain in the form of synaptic liaisons among skittering dendrites. The integration of that information lays the foundation of consciousness. Through an integrated information theory, pioneered by the University of Wisconsin's Giulio Tononi, one can viably posit a unified consciousness from the seemingly endless causal interactions within the relevant parts of the brain. Because one can theoretically calculate the extent of this integration, one can feasibly identify consciousness. More so: One can also measure it. Koch is quick to caution that this is all a postulate. Nothing has been finalized in the frenzied quest to map out consciousness. "We cannot yet calculate the state of awareness for even the simple roundworm with current computers, let alone deal with the complexity of the human brain," he writes.19 At this point in time, in short, we still have no idea what it's like to be a bat.

But that doesn’t mean we someday won’t. What Koch has unequivocally discovered in his impassioned search for an empirical understanding of consciousness is something critical within himself: utter perseverance. He won’t quit. “My aim in life,” he told me, “is to have a grand view of how this all works.” Not to do so, he explained, would be “scandalous.” Evaluating Koch’s mission in light of the humanities, it’s tempting to assess his project in zero-sum terms—that is, as an endeavor that seeks to replace imagination with information, creativity with concreteness, subjectivity with objectivity, the soul with the body. It’s tempting, in other words, to see the BRAIN Initiative as a hubristic endeavor to reduce the beautiful messiness of humanistic creativity to the neatness of a mathematical equation. Koch’s aggressive rhetorical defense of science does little to discourage such a concern. He writes: If we honestly seek a single, rational, and intellectually consistent view of the cosmos and everything in it, we must abandon the classical view of the immortal soul. It is a view that is deeply embedded in our culture; it suffuses our songs, novels, movies, great buildings, public discourse, and our myths. Science has brought us to the end of our childhood. Growing up is unsettling to many people, and unbearable to a few, but we must learn to see the world as it is and not as we want it to be. Once we free ourselves of magical thinking we have a chance of comprehending how we fit into this unfolding universe.20

It’s tempting to see the BRAIN Initiative as a hubristic endeavor to reduce the beautiful messiness of humanistic creativity to the neatness of a mathematical equation.

Magical thinking? This passage is enough to make the humanist cringe. It's time to grow up? Be done with childish things? We recall our innocent love for the blues of Robert Johnson, and we think, "Leave my songs alone!" We consider how stunned we feel reading the last paragraph of Cormac McCarthy's Suttree, and we think, "Don't touch my novels!" We pause and re-read Mark Strand's reference to "an understanding that remains unfinished," and we say, "Stay away from my poetry!" We enter the Rothko Chapel and ride a wave of new-agey spiritualism, and think, "Leave my architecture in peace." We ponder Nagel's question, tapping the spirit of the Beatles, and we say to ourselves, "Let it be."

But the reductionists are on hand to throw cold water on our mystical musings: The majority of today’s philosophers of mind defend Koch’s empiricism. (“We’d better figure out what it’s like to be a bat,” Georgetown philosophy professor Bryce Huebner told me.) Koch and his colleagues refuse to accept the phrase “This will never be answered.” (Koch told me that skeptics have uttered such nonsense throughout history.) The humanities are slowly ceding ground to the “neuro-humanities” and the “digital humanities.” The federal government is spending hundreds of millions of dollars to map the circuitry of the mind. And the majority of today’s best and brightest undergrads are allegedly “excellent sheep,” moving zombie-like toward STEM-related pursuits and jobs making apps and managing hedge funds. Chances are good, in other words, that nothing will be left alone. And so we have to ask: Will the humanities sink under the weight of science? And if not, how will they respond? What will be their role?

Can Wonder Save the Humanities?

The fear pervading the humanities these days evokes what the historian of science Lorraine Daston has recently called "the paradox of wonder." She describes it in the most eloquent terms:

The marvel that stopped us in our tracks—an aurora borealis, cognate words in languages separated by continents and centuries, the peacock's tail—becomes only an apparent marvel once explained. Aesthetic appreciation may linger…but composure has returned. We are delighted but no longer discombobulated; what was once an earthquake of the soul is subdued into an agreeable frisson.… The more we know, the less we wonder.21

The world's greatest scientific thinkers have always been a little bamboozled in the face of this relationship. Descartes believed that wonder was a necessary spur to scientific knowledge as well as a seductive stimulus that could turn turbulent and addictive. Bacon similarly understood wonder to be "a dangerous passion," deeming it a form of "broken knowledge" that should only be sampled in small doses and as a means to fixing that break. Since the days of Descartes and Bacon, the sensation of wonder—at least in the world of contemporary science—has evolved into either an embarrassing illumination of ignorance (think astrology) or an affirmation of an Occam's razor–style scientific explanation (think natural selection). The generally pejorative connotation of wonder, however, is such that, as Daston explains, "humanists are even more chary [than scientists] of expressing wonder."22 Wonder, in other words, might be integral to the humanistic worldview, but as a state of mind, it's on the ropes.

Christof Koch, for all his empiricism, suggests how it might make a comeback. Hints that Koch approaches life with a profound attunement to its wondrous manifestations, and that such wonder underscores his science, come through in his personal website. There, you can find an illustrated narrative of long solo hikes on the


John Muir Trail, multi-hour trail runs in the San Gabriel Mountains, and grueling rock-climbing excursions in Yosemite Valley. There, you can read dreamy commentary such as "At night, I would contemplate the high alpine sky above me and the moral compass within me" or "I discovered the Zen of marathon running."23 It's evident that Koch is keyed in to the subtle tremors of human experience, the kinds that give you butterflies and make your heart race—the kinds that the classical humanist is loath to reduce to an algorithm. When we spoke, Koch lowered his voice almost to a whisper as he said, "I find myself in this wonderful universe that's so conducive to life."

The subtitle of his book Consciousness confirms this unexpected impulse animating his worldview: Confessions of a Romantic Reductionist. At the end, he poignantly confronts the inevitability of his own death—“the significance of my personal annihilation”—and delivers an assessment that reifies the romantic part of reduction. After “facing down an existential abyss of oblivion and meaningless within me,” he writes, he underwent “an unconscious process of recalibration,” and arrived here: I returned to my basic attitude that all is as it should be. There is no other way than I can describe it: no mountaintop conversion or flash of deep insight, but a sentiment that suffuses my life. I wake up each morning to find myself in a world full of mystery and beauty. And I am profoundly thankful for the wonder of it all.24

"The wonder of it all." This grabbed me. Could wonder save the humanities? Reading history and literature has taught me that all ideologies ultimately hit and crumble upon the brick wall of reductionism. It has taught me that if ideologies proved completely true, something essential about humanity would be lost. It has taught me that the pursuit of knowledge (scientific or otherwise) has meant different things to different people at different times. Francis Bacon, for one, would never have entertained an ambiguity-crushing version of existence of the kind espoused by today's reductionists. So why should we? Not to be glib, but why should there be only one version of scientific truth today? Could it be that Koch and his cohort have hijacked and linearized the entire idea of science itself? Has the desire for reductionism reduced science to a single paradigm that marginalizes any phenomenon—irony, for instance—that resists the explanatory powers of brain mapping? And might it not be the very job of the humanities to bring back these intangibles to the center of humanity and celebrate them and ask science to broaden its horizons to accommodate mystery? I wondered.

And then I reconnected with Sam McNerney. After moving to New York, he got a small loan from his dad—"How many dads would do that?"—moved in with his girlfriend, and started to blog about the mind, the brain, business, and philosophy. He ran out of money. "It sucked," he said. But he stuck to his ambition: becoming the best thinker and writer he could be. After a lot of online writing, he received a call from a big publishing company asking if he was interested in writing an in-house blog about business books. He was. When I called him in early December, his job with the publishing company had run its course and he was back in his tiny Manhattan apartment, writing freelance. "I finally have time to think!" he said. We discussed an article he was working on about how technology structures


information. "Technology advances," he said, "and we complain about information overload. But it turns out we're really good at organizing information. And when we organize information really well, something gets sacrificed." He wondered, "Is information too organized?" He wondered, "What gets sacrificed?" The answer he suggested is a word I hadn't heard in a while, but, given all the research I'd been doing for this article, it initiated a kind of convergence: "serendipity."

Serendipity, indeed. Serendipity is so beautifully slippery. It situates itself between Nagel and Koch, empiricism and the humanities, wonder and science. Serendipity, when you get right down to it, is at the beating heart of wonder. It accounts for McNerney's epiphany, Sander's love of literature, my own distrust of reducibility, and Koch's comfort with his own "annihilation." Serendipity interrupts the linear view of science. McNerney brought up Darwin. Darwin, he explained, would never have read Malthus's 1798 essay on population growth, and thus would never have developed his theory of natural selection, had he been "searching for birds on Google." The comparative looseness of information, the unchanneled nature of investigation, joined with Darwin's innate curiosity, is what led to one of the most unifying explanations of physical existence the world has known. What sparked it all was pure serendipity.

In our age of endlessly aggregated information, the ultimate task of the humanities may be to subversively disaggregate in order to preserve that serendipity. After all, a period of confusion inevitably precedes the acquisition of concrete knowledge. It's a necessary blip of humbling uncertainty that allows for what McNerney describes as "the call and response" between disparate ideas. As long as that gap exists, as long as a flicker of doubt precedes knowledge, there will always be room for humanistic thought—thought that revels in not knowing. As long as that gap exists, we will not be reduced to the moral equivalent of computers.

“There’s a big benefit to not knowing the answer to a question for a long time,” McNerney said toward the end of our conversation. “The trick is knowing enough but not too much, not so much that you kill that sense of wonder.”

Endnotes

1 Thomas Nagel, “What Is It Like to Be a Bat?,” Philosophical Review 83, no. 4 (October 1974), 435.
2 Ibid.
3 Russell Marcus, e-mail message to author, November 28, 2014.
4 Richard Brody, “Thomas Nagel: Thoughts are Real,” The New Yorker online, July 16, 2013; http://www.newyorker.com/books/page-turner/thomas-nagel-thoughts-are-real.
5 “Bachelor’s Degrees in the Humanities,” Humanities Indicators, American Academy of Arts and Sciences; http://www.humanitiesindicators.org/content/indicatordoc.aspx?i=34. Accessed April 21, 2015.
6 Tamar Lewin, “As Interest Fades in the Humanities, Colleges Worry,” The New York Times, October 30, 2013; http://www.nytimes.com/2013/10/31/education/as-interest-fades-in-the-humanities-colleges-worry.html?_r=0.
7 Sarah Sachs, “Economics sees growing pains as students look for ‘marketable skills’,” The Brown Daily Herald, April 5, 2013; http://www.browndailyherald.com/2013/04/05/economics-sees-growing-pains-as-students-look-for-marketable-skills/.
8 Anthony P. Carnevale et al., “Not All College Degrees Are Created Equal,” report from Georgetown’s Center for Education and the Workforce, 4; https://www.cgsnet.org/ckfinder/userfiles/files/Unemployment_Final_update1.pdf. Accessed April 21, 2015.
9 Duke University, Office of News and Communications; http://newsoffice.duke.edu/all-about-duke/quick-facts-about-duke. Accessed April 21, 2015.
10 Susan Adams, “Majoring in the Humanities Does Pay Off, Just Later,” Forbes, January 22, 2014; http://www.forbes.com/sites/susanadams/2014/01/22/majoring-in-the-humanities-does-pay-off-just-later/.
11 Mark Bauerlein, “Humanities: Doomed to Lose?” The New Criterion, November 2014; http://www.newcriterion.com/articles.cfm/Humanities--doomed-to-lose--7989.
12 Arthur Krystal, “The Shrinking World of Ideas,” The Chronicle of Higher Education, November 21, 2014; http://chronicle.com/article/The-Shrinking-World-of-Ideas/150141/.
13 William Deresiewicz, Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life (New York: Free Press, 2014), 111.
14 Ibid., 13.
15 Logan Sander, telephone interview with author, November 25, 2014.
16 Leslie Jamison, “Instead of Sobbing, You Write Sentences: An Interview with Charles D’Ambrosio,” The New Yorker online, November 26, 2014; http://www.newyorker.com/books/page-turner/instead-sobbing-write-sentences-interview-charles-dambrosio.
17 Christof Koch, Consciousness: Confessions of a Romantic Reductionist (Cambridge, MA: MIT Press, 2012), 42.
18 Ibid., 27.
19 Christof Koch, “A Theory of Consciousness,” Scientific American Mind, 19; http://www.klab.caltech.edu/koch/CR/CR-Complexity-09.pdf. Accessed April 21, 2015.
20 Koch, Consciousness, 152.
21 Lorraine Daston, “Wonder and the Ends of Antiquity,” The Point; http://thepointmag.com/category/examinedlife. Accessed April 22, 2015.
22 Ibid.
23 Christof Koch personal website; http://www.klab.caltech.edu/koch/passions.html. Accessed April 21, 2015.
24 Koch, Consciousness, 161.


The Great Subversion: The Scandalous Origins of Human Rights Ronald Osborn

WHEN BRITISH COMEDIAN STEPHEN FRY

declared in a January 2015 interview on Irish television that if God exists, he is “utterly evil, capricious, and monstrous,” his remarks drew headline attention in newspapers and nearly four million views on YouTube within less than a week of the video’s posting.1 Fry was repeating an argument with a very long history, extending back through David Hume to the Epicureans of ancient Greece and Rome (at least according to the Christian apologist Lactantius, writing in the fourth century).2 He was also echoing sentiments that may be found in one form or another in any number of recent books and articles, both scholarly and popular, whose authors declare that religious beliefs are at best unnecessary and at worst antithetical to humanistic values, human rights, or even morality in general. In a 2011 article in the New York Times titled “The Sacred and the Humane,” for example, Israeli philosopher and human rights activist Anat Biletzki wrote, “There is no philosophically robust

reason to accept the claim that human dignity originates with God.” 3 If anything, Biletzki argued, belief in God is a threat to humanistic values and to concepts of human dignity. Religion should not even be admitted “as a legitimate player in the human rights game,” she wrote, since those concerned with defending rights out of a sense of religious duty are not concerned with rights but only with a kind of slavish obedience to the arbitrary commands of the deity. Other non-religious thinkers, however, have called into question the philosophical coherence and long-term viability of secular humanism and accompanying rights ideals in the wake of the “death of God.” According to British political scientist Stephen Hopgood, “The ground of human rights is crumbling beneath us,” both in theory and in practice: “The world in which global rules were assumed to be secular, universal and nonnegotiable rested on the presumption of a deep worldwide consensus about human rights—but this consensus is illusory.”4 What is

Ronald Osborn is an Andrew W. Mellon Fellow in the Peace and Justice Studies Program at Wellesley College and the author of Death before the Fall: Biblical Literalism and the Problem of Animal Suffering (2014) and Anarchy and Apocalypse: Essays on Faith, Violence, and Theodicy (2010). Left: The Rape of the Sabine Women, 1962, Pablo Picasso (1881–1973); Musée National d’Art Moderne, Centre Pompidou, Paris, France/Bridgeman Images. © 2015 Estate of Pablo Picasso/Artists Rights Society (ARS), New York.



more, Hopgood argues in The Endtimes of Human Rights, notions of inviolable human dignity, rights, and equality as universal norms must now be unmasked as a historically contingent and metaphysically dubious inheritance of Christianity: It is only as a strategy for coping with what Nietzsche called “the death of God” in the West that we can begin to understand the real social function of humanitarianism and human rights in the twentieth century…. [The International Committee of the Red Cross] was, I argue, the first international human rights organization. It was a secular church of the international. The laws it wrote and the humanitarian activism it undertook were grounded by a culture of transcendent moral sentiment with strong Christian components. At the heart of this was the suffering innocent, a secular version of Christ. In other words, bourgeois Europeans responded to the erosion of religious authority by creating authority of their own from the cultural resources that lay scattered around them. And they then globalized it via the infrastructure that the imperial civilizing project bequeathed to them.5 Hopgood’s bracing critique of rights talk and his call for a less lofty, more pragmatic dispensation forces us to face the implications of the loss of theological anthropology for concepts of human equality and dignity. Can we have a rationally coherent, morally compelling, and historically sustainable discourse as well as a practice of humanistic values and human rights absent a “thick” metaphysical or religious framework, such as the one provided in the Western tradition for some two millennia by Judeo-Christian sources? Put another way, the question “Can we be good without God?” does not strike nearly deep enough. The urgent question is: Will we still be

good to the stranger in our midst, or good in the same ways, once we have fully grasped the contestable character of humanism and once we have utterly abandoned the essentially religious idea that every person is made, in the enigmatic language of Scripture, in the image of God? It is a question that even committed atheists, for the sake of good atheism, should find worthy of consideration.

Doctrines of Inequality

Answering this question requires that secular humanists attend more closely to the scandalous particularity of the story of the God made visible as a manual laborer from a defeated backwater of the Roman Empire, who was tortured to death by the political and religious authorities of his day on charges of sedition and heresy. We can imagine other religious narratives that could have provided an equally powerful vision and inspiration for humanistic values, but it was this narrative that actually did provide the moral and intellectual foundation for the rise of humanism, and finally liberalism, in the Western tradition.


In classical antiquity, dignity was an acquired rather than inherent trait. Some persons were always deemed more fully human than others.6 Infants born with mental or physical defects, Plato and Aristotle both declared, have no right to share in the life of the community and indeed have no right to life at all. In The Politics, Aristotle writes, “let there be a law that no deformed child shall live.”7 In Plato’s Republic, Socrates says that those


“born deformed, [the Guardians] will hide away in an unspeakable and unseen place, as is seemly.” He goes on to encourage free sexual intercourse among adolescents on one condition: that they not “let even a single foetus see the light of day,” and, “if one should be conceived, and, if one should force its way,” that they “deal with it on the understanding that there’s to be no rearing for such a child.”8 In both Greek and Roman thought, slaves, women, and children possessed less dignity than free males, while philosophers capable of attaining heights of speculative philosophy possessed more dignitas—prestige, status, or worthiness—than those who labored with their hands.


Similar ideas about human inequality pervaded (and continue to pervade) non-Western belief systems. The caste system of Hinduism and classical Buddhist doctrines of reincarnation (according to which the less fortunate or “weak” members of society—the poor, the physically handicapped, and women in general—are born into “lowliness” as a punishment for sins in previous lives) run directly counter to concepts of inviolable dignity and shared human rights. The assumption of a rank-ordering or natural hierarchy of human types, with only a few individuals possessing true dignity and so full social standing, may actually represent the most nearly universal political morality that we can identify. These classical beliefs in the natural inequality of persons did not give way to the idea of shared human dignity and equality as a result of detached philosophical reasoning. Rather, they

were radically subverted by the theological account of personhood unfolded in the Hebrew Bible and culminating in the Christian narrative of the life, death, and resurrection of Christ—the climax of the Jewish prophetic tradition with its radical insistence that the Creator God of the universe stands with the weak, the suffering, and the lowly, judging rulers and nations according to whether they have acted justly toward widows, strangers, and orphans.

Sex, Lies, and Conquest

To grasp what Christianity opposed, and what it historically overcame, we might consider a seemingly trivial detail of life during the Pax Romana: coins on which defeated nations were depicted as violated women being trampled underfoot by deified emperors or Roman gods. To comprehend the deeper meaning of these symbols of imperial consciousness, we must recall the foundational myth of the city of Rome to which they alluded. Central to the legend of the founding of Rome by Romulus is “The Rape of the Sabine Women,” a story whose theme is celebrated in Roman art and literature. As told by Livy in his History of Rome, written about thirty years before the birth of Christ, the tale begins with Romulus offering asylum to male refugees from other nations, who quickly swell the city’s population and transform Rome into a “match for any of the neighboring states in war.”9 The sudden increase in the number of males of fighting age leads, however, to a pressing dilemma: There are not enough women to repopulate the city. Romulus sends ambassadors to neighboring states asking them to give their daughters as brides to the Romans, but this request is met with refusals, and, as a result, tensions rise. “The Roman youths were bitterly indignant at this, and the matter began unmistakably to point to open violence.”10 Romulus, “dissembling his resentment,” according to Livy, nonetheless tricks the young


women of Sabine (one of the states that rebuffed him) into coming to Rome. At a prearranged signal, the Roman men pounce upon the Sabine maidens and carry them off, those of “surpassing beauty” being reserved for “the leading senators.” Romulus attempts to mollify the traumatized women by assuring them that they will “be lawfully wedded, and enjoy a share of all their [Roman] possessions and civil rights, and—a thing dearer than all else to the human race—the society of their common children: only let them calm their angry feelings, and bestow their affections on those on whom fortune had bestowed their bodies.”11 The kidnapped women do not embrace their captors, however, and the Sabine men soon launch a counterattack. After some back-and-forth fighting, the Romans gain the upper hand. Seeing their loved ones on the verge of being slaughtered, the Sabine daughters rush onto the battlefield, pleading that the combat cease, lest they become widows through the deaths of their Roman husbands or orphans through the deaths of their Sabine fathers. Livy relates that the “leaders thereupon came forward to conclude a treaty; and not only concluded a peace, but formed one state out of two…. They united the kingly power, but transferred the entire sovereignty to Rome.”12


This story of the rape of the Sabine women, religious studies scholar Davina Lopez writes, was the paradigmatic model of, and justification for, Roman expansionism. Its purpose as an origins myth was to make imperial violence appear noble and “like the natural order of the world.”13 Rape was the perhaps painful but ultimately glorious way by which Rome incorporated the Other

within its civilized laws and “civil rights.” The story was “truly foundational to Roman imperial ideology as it expresses relationships between self and other on an international scale…. Conquest rendered in these terms reflects gendered difference in hierarchy: the impenetrable masculinity inherent in Roman rule is chosen to penetrate the femininity of other lands and peoples.”14

The Shape of the In-Breaking Kingdom

In an article in the Boston Review, historian Samuel Moyn writes that neither Jesus nor Paul had “any truly political vision.”15 But John Dominic Crossan, N.T. Wright, Richard Horsley, and a host of other biblical scholars have shown in great detail that the New Testament is in fact intelligible only when read as a highly subversive and politically charged collection of texts against the historical backdrop of Roman imperial conquest and occupation and the crushing social hierarchies of the ancient world that find virtually unanimous support in the canons of Greek and Roman philosophy, religion, and myth. According to the earliest Christian documents, God had not only taken on human flesh but was also incarnated in the person of a poor, provincial laborer in the occupied territories of the Roman Empire. Jesus grew up in Nazareth, a tiny village about four miles from the town of Sepphoris, which was struck by Varus’s legionary troops in 4 BCE. Josephus records another attack, led by Lucius Annius at Gerasa just across the Jordan River, and his account makes apparent the atmosphere of violence and national trauma in which Jesus was raised: [Lucius Annius] put to the sword a thousand of the youth who had not already escaped, made prisoners of women and children, gave his soldiers license to plunder the property, and then set fire to the houses and advanced against the


surrounding villages. The able-bodied fled, the feeble perished, and everything left was consigned to the flames.16 We can perhaps now better appreciate the scandalous, as well as dangerously “unpatriotic,” political significance of Christ’s declaration in the Gospel of Matthew—at time of foreign imperial occupation punctuated by periodic massacres, mass crucifixions, and insurgency—that God’s kingdom was breaking into history through his own words and actions, and that the shape of God’s in-breaking kingdom entailed an ethic of radical love of one’s enemies beyond good and evil: You have heard that it was said, “You shall love your neighbor and hate your enemy.” But I say to you, love your enemies and pray for those who persecute you, so that you may be sons of your Father who is in heaven; for He causes His sun to rise on the evil and the good, and sends rain on the righteous and the unrighteous. (Matthew 5:43–48, New American Standard Bible) Lest anyone interpret Christ’s words as a retreat from the burning political matters of his day or as capitulation to Roman imperialism, however, we might ponder the Magnificat, the song of praise by Jesus’s mother, Mary, in the first chapter of the Gospel of Luke, which is presented as a prelude to what her son’s entire life will be about: “He has brought down rulers from their thrones, And has exalted those who were humble. He has filled the hungry with good things; And sent away the rich empty-handed” (Luke 1:52–53). The very word the Christian writers chose for the story of Jesus was in fact an appropriation and subversion of Roman political rhetoric; euangelion, translated as “Gospel” or “good news,” was the word used by the Caesars for their official imperial proclamations. From all we know of Jesus’s words and actions, he set his followers on a collision course with the

dominant pagan social and political structures of their day, which could only be sustained so long as classical ideas about what it means to be human remained undisturbed. In the Gospels, Christ is referred to several times as a tekton and the son of a tekton—literally a “craftsman” or, as tradition would have it, a carpenter. This already tells us much about the revolution underway, for in the Greco-Roman world, to be a laborer was to be inferior. Christ’s public career was marked by his ministry to the most marginalized and untouchable members of society, whom he sought to restore to physical wholeness and fullness of community. Prominent among these were women, including one about to be stoned to death by religious zealots for alleged adultery (John 7:53–8:11) and one who had been suffering from a bleeding illness for twelve years, whom, according to Jewish law, no one could touch without becoming defiled (Mark 5:25–34). Jesus’s life ended in his torture and execution at the hands of those religious and political authorities possessing the most dignitas. The method of execution was an emphatically political one, crucifixion typically being reserved for the most serious crimes against the Roman state.17 What is more, the writers of the Gospels of Matthew and Mark both assert that in his final agony, Christ was abandoned by God himself. The cry of dereliction from the center cross is the cry of one who has been not only humanly but even cosmically betrayed: “My God, my God, why have you forsaken me?” (Matthew 27:46, Mark 15:34) Because Christ bids those who would follow him to take up his cross and share in his sufferings, one can only be a disciple if one has also, paradoxically, experienced the death of God. Yet for Christ’s followers, the spectacle of Jesus’s agony and humiliation—the extreme depths of his identification with the sufferings of humanity, and even with its loss of faith or hope—had ironically unmasked the “principalities and powers” once and for all, stripping them of their sacral authority and revealing them as unjust and oppressive forces.


Followers of the risen Christ were to courageously emulate his example of self-emptying service and reconciling enemy love, even to the point of their own deaths, if necessary, for the sake of others. The political implications of the claim that the Godforsaken God has elevated the weak and lowly to a status of equality and high dignity as adopted sons and daughters through his incarnation, suffering, death, and resurrection, are evident in Paul’s revolutionary words from Galatians 3:28: “There is neither Jew nor Greek, there is neither slave nor free man, there is neither male nor female; for you are all one in Christ Jesus.” In a world in which the exposure of newborn infants to the depredations of wild animals and mass executions for public entertainment were regular spectacles, in which slaves—whom Aristotle refers to as “living tools”—were defined by law as non habens personam (“not having a persona,” or even “not having a face”), and in which a polymorphous polytheism led not to liberal toleration of difference, as some have claimed, but to frequently unrestrained violence against anyone who challenged the gods of the family hearth, the tribe, and the empire, the Christian euangelion could only arrive, David Bentley Hart writes, as a “cosmic sedition.”18 Christianity not only offended the patrician sensibilities of Roman aristocrats, as it would Nietzsche, by its undignified concern for the weak and lowly; it also threatened the entire social and political order of pagan antiquity by dramatically redefining what it meant to be human. “What for us is the quiet, persistent, perennial rebuke of conscience within us was, for ancient peoples, an outlandish decree issuing from a realm outside any world they could conceive.”19

Discovering Dignitas

Even if the language of “rights” was not explicitly or formally used, the New Testament invested every person with a previously unimaginable worth. Instead of struggling to attain dignitas as

a scarce commodity in competitive rivalry with others, all persons were now summoned to live in generous solidarity with their neighbors as persons of dignity and worth equal to their own. Dignity, in the Christian revaluation of values, could not be earned, because it was bestowed as a gift from God, although the gift could be lost or squandered precisely by transgressing the dignity of the Other, whether through violence or by indifference to the Other’s welfare—by denying that that person too was the privileged bearer of the divine image, the divine image now being of a man broken, tortured, and executed by the state. One of the most potent expressions of the Christian invention (if not discovery) of human equality was the way the early believers gathered together for table fellowships without regard for social standing. In the rigidly stratified world of ancient Greece and Rome, in which one’s status determined with whom one could and could not break bread, Christians transgressed all decorum and standards of decency in their common meals or communions. Whereas the model for the incorporation of foreign bodies into the Roman body politic was paradigmatically set by the myth of the rape of the Sabine women, incorporation of new believers into the body of Christ was patterned upon the story of Christ’s last supper— the memory of how Jesus washed the feet of his disciples, the task of a slave, and generously gave of his own body, symbolized by broken bread and wine, so that others might live with abundance. The new faith proved especially attractive to women, sociologist Rodney Stark has shown from a wide array of textual and archaeological sources. By all accounts, Christianity disproportionately drew in female adherents, whose status and power were significantly enhanced by entry into the Christian subculture.20 Women held positions of high leadership in the fledgling church. They could marry later in life (Roman families often gave away prepubescent daughters in marriage), and they benefited from Christian condemnation of traditional male prerogatives in regard to


Saint Thecla and Saint Paul with a book, eleventh century; British Museum/Erich Lessing/Art Resource, NY.

divorce, incest, infidelity, polygamy, and female infanticide.21 Paul’s notorious statements about wives’ “submission” to their husbands must be read in full context if one is to grasp their radically equalizing message of mutual submission and reciprocity patterned upon Christ’s own agape, his selfless love. In Ephesians 5:22–23, Paul writes, “Wives, be subject to your own husbands, as to the Lord. For the husband is the head of the wife, as Christ also is the head of the church, he Himself being the Savior of the body.” Yet these verses are part of an extended discourse on marital relations in which Paul commands husbands and wives to “be subject to one another” in reverence of Christ (Ephesians 5:21). He goes on to instruct men, “Husbands, love your wives, just as Christ also loved the church and gave Himself up for her…husbands ought also to love their own wives

as their own bodies. He who loves his own wife loves himself; for no one ever hated his own flesh, but nourishes and cherishes it, just as Christ also does the church, because we are members of His body…each individual among you also is to love his own wife even as himself, and the wife must see to it that she respects her husband” (Ephesians 5:25–33). However problematic these statements might sound to readers today, it is important to judge their emancipatory force in the social context of Paul’s day rather than our own. It was in fact a common slur against Christianity that it was a religion for women. Insofar as women in the ancient world very often had their dignity violated by powerful men, the slur was entirely accurate. Paul’s letters do not include any explicit condemnations of slavery, although in one of his letters of ad hoc pastoral counsel he urges a 97


Christian slave owner, Philemon, to receive back into his household a runaway slave, Onesimus, in order to be reconciled to him. Some readers have concluded that on the question of slavery Paul therefore endorsed the status quo. But Paul’s response was deeply subversive of the practice in other ways.22 In his letter to Philemon, he redefines the relationship between master and slave in a way that rules out the Aristotelian view of “natural” subjugation and inequality. Because Philemon is now a Christian, Paul writes, he must view Onesimus “no longer as a slave, but more than a slave, a beloved brother” (Philemon 1:16). (Compare Aristotle’s Politics: “For that some should rule and others be ruled is a thing not only necessary, but expedient; from the hour of their birth, some are marked out for subjection, others for rule.”)23 Deeply ingrained beliefs in human inequality did not go without a fight; nor did Christians cease being people of their time. Evidence of this may be found within the biblical text itself, which frequently lays bare the shortcomings of the early believers. Paul chastises wealthy believers in Corinth, for example, for excluding the poor and uneducated from their common meals. He could not force the churches he had planted to change their ways, but he could appeal to their memories of the Jesus story and to the witness of his own life as a model worthy of emulation by those of high social status, effectively reversing the meanings of “high” and “low” so as to render them meaningless: We are fools for Christ’s sake…we are weak, but you are strong; you are distinguished, but we are without honor. To this present hour we are both hungry and thirsty, and are poorly clothed, and are roughly treated, and are homeless; and we toil, working with our own hands; when we are reviled, we bless; when we are persecuted, we endure; when we are slandered, we try to conciliate; we have

become as the scum of the world, the dregs of all things, even until now…. Therefore I exhort you, be imitators of me (1 Corinthians 4:10–13, 16).

A Tragic Double Subversion

The story of the Christian subversion of pagan values would over time become the story of a tragic double subversion. The retrenchment of hierarchy and domination within the church—particularly after Constantine made Christianity the religion of the empire in the fourth century, reversing several centuries of persecution of believers—means that Christianity is today vulnerable to the charge of being a net force for inequality, hierarchy, violence, and oppression. Yet such an indictment of Christianity can be made, ironically, in large part only because of the very moral and humanistic categories introduced into the West by Christianity itself. The Christian proclamation of the full moral equality of all persons—revealed not by nature or science but through the Imago Dei and the Incarnation of Christ—led gradually but inexorably to a dramatic overturning of the hierarchical values of the ancient world.24 The early churches and later monastic orders modeled ideals of self-regulation, nonviolence, charity, freedom of discussion, separation of spiritual from temporal power, solidarity with the poor, and limited government in imperfect but unprecedented ways. With the spread of Christian moral intuitions, the concept of community was decoupled from tribal or ethnic bloodlines as well as from “natural” hierarchies and was redefined as a voluntary association of individuals of all classes and ethnicities. The highest models of heroism were no longer warriors who conquered and subjugated their rivals, but Christian martyrs—both men and women, often of lowly origin—who displayed a form of courage-in-weakness that was democratically open to all. With the increasing penetration


of the Roman state by believers, the rhetoric of leadership also changed. Members of the urban elite who aspired to high office were increasingly compelled to speak (whether sincerely or pragmatically) not of their own nobility, but, rather, of their great “love of the poor.”25 Authority in the emerging Christian “social imaginary,” to use Charles Taylor’s phrase, was likewise relativized in decidedly moral terms, not as dominion but as stewardship. Rulers would now be held to account by clergy and ordinary people on the basis of the subversive ideal of “slave morality”: servanthood. To be a true “lord,” following the example of Lord Jesus, was, paradoxically, to be a humble servant—indeed, a “slave”—of all.


Although it would take considerable time for these ideas to permeate European culture to the point that they would come to be regarded as virtually self-evident truths, there is an undeniable link between the story-shaped life of the early Christian communities and the lawshaped life of later Western civilization. The idea of natural rights was inscribed in canon law by medieval Christian thinkers as early as the twelfth century.26 Principles of religious toleration and liberty of conscience often credited to Enlightenment thinkers like Locke and Voltaire were already well established in the writings of believers such as Erasmus, Sebastian Castellio, Roger Williams, and the radical reformers in the Anabaptist tradition.27 Legal scholar John Witte writes that the Enlightenment “was not so much a well-spring of Western rights as a watershed in a long stream of rights thinking that began nearly two millennia

before.” This is not to deny or minimize the contributions of Enlightenment thinkers to the idea of rights, Witte asserts; rather, what these later individuals “contributed more than anything were new theoretical frameworks that eventually widened these traditional rights formulations into a set of universal claims that were universally applicable to all.”28 The religious studies scholar Bruce K. Ward argues that in place of the story that has come to dominate much of the academy as well as popular culture, of how the invention of the secular saved the West from the violence of religion, we should speak in terms of violent forms of religion being challenged by nonviolent ones, with the latter ultimately giving rise to liberal values and legal formulations.29 There is nothing in this admittedly outrageously simplified brush-stroke history, of course, that amounts to proof for the metaphysical truth claims of Christianity. One might freely acknowledge the centrality of Christian beliefs to the historical and philosophical rise of concepts of human equality and the overturning of ancient hierarchies while asserting that these beliefs are at best noble fictions and that the values could just as easily have been arrived at by some purely secular path (such that humanism can now float free of its historical past and become, in the words of Thomas Nagel, a “view from nowhere”).30 Alternatively, we might join Nietzsche and his postmodern heirs in rejecting liberal and humanistic values as masks for resentment and power on the logically consistent grounds that the death of God must also lead to the death of the image of God in the Other—and all that went with it. I will not attempt to answer the Nietzschean or postmodern challenges to humanism here except to say that Nietzsche was right: Christianity is slave morality—unapologetically and transparently so. Unlike Nietzsche, however, I take this on balance to be cause for celebration. If secular humanists and atheists committed to liberal values cannot believe in theism, they might still find good reasons to be grateful for it.


Endnotes

1 “Stephen Fry on God: The Meaning of Life: RTE One” [video], January 28, 2015; https://www.youtube.com/watch?v=-suvkwNYSQo.
2 In his Dialogues Concerning Natural Religion, Hume, through the voice of Philo, declares, “Epicurus’ old questions are yet unanswered. Is [God] willing to prevent evil, but not able? then is he impotent. Is he able, but not willing? then is he malevolent. Is he both able and willing? whence then is evil?” David Hume, Dialogues concerning Natural Religion, second edition, ed. Richard H. Popkin (Indianapolis, IN: Hackett, 1998), 63; Tim O’Keefe, Epicureanism (London: Routledge, 2010), 47.
3 Anat Biletzki, “The Sacred and the Humane,” New York Times, July 17, 2011; http://opinionator.blogs.nytimes.com/the-sacred-and-the-humane.
4 Stephen Hopgood, “The End of Human Rights,” Washington Post, January 3, 2014; http://www.washingtonpost.com/opinions/the-end-of-human-rights/2014/01/03/7f8fa83c-6742-11e3-ae5622de072140a2_story.html.
5 Stephen Hopgood, The Endtimes of Human Rights (Ithaca, NY: Cornell University Press, 2013), x.
6 Darrel W. Amundsen, “Medicine and the Birth Defects of Children: Approaches of the Ancient World,” in On Moral Medicine: Theological Perspectives in Medical Ethics, eds. Stephen E. Lammers and Allen Verhey (Grand Rapids, MI: Eerdmans, 1987), 681–92.
7 Aristotle, The Politics, ed. Stephen Everson (Cambridge, England: Cambridge University Press, 1988), VII. 1335b.
8 Plato, The Republic, trans. Allan Bloom (New York: Basic Books, 1968), V. 460c–461c.
9 Titus Livy, Roman History, trans. John Henry Freese, Alfred John Church, and William Jackson Brodribb (New York: D. Appleton, 1898), 11.
10 Ibid., 11.
11 Ibid., 11−12.
12 Ibid., 15.
13 Davina Lopez, Apostle to the Conquered: Reimagining Paul’s Mission (Minneapolis: Fortress Press, 2008), 70.
14 Ibid., 70−71.
15 Samuel Moyn, “Did Christianity Create Liberalism?,” Boston Review, February 9, 2015; https://bostonreview.net/books-ideas/samuel-moyn-larry-siedentop-christianity-liberalism-history.
16 Josephus, The Jewish War, Books III−IV (vol. 2), trans. H. St. J. Thackeray (Cambridge, MA: Harvard University Press, 1997), 301; see also John Dominic Crossan, God and Empire: Jesus against Rome, Then and Now (New York: HarperCollins, 2007), 110.
17 See, for example, Martin Hengel’s classic study of the practice, Crucifixion (Philadelphia: Fortress Press, 1977), 46.
18 David Bentley Hart, Atheist Delusions: The Christian Revolution and Its Fashionable Enemies (New Haven, CT: Yale University Press, 2009), 124.
19 Ibid., 169.
20 Rodney Stark, The Rise of Christianity: How the Obscure, Marginal Jesus Movement Became the Dominant Religious Force in the Western World in a Few Centuries (Princeton: Princeton University Press, 1996).
21 Stark quotes from a letter dating from 1 BCE written by a seemingly devoted husband, Hilarian, to his wife, Alis, to illustrate the pagan world’s casual disregard of female infants: “I ask and beg you to take good care of our baby son, and as soon as I receive payment I shall send it up to you. If you are delivered of a child, if it is a boy keep it, if a girl discard it. You have sent me word, ‘Don’t forget me.’ How can I forget you. I beg you not to worry.” Ibid., 97−98.
22 See Neil Elliott, Liberating Paul: The Justice of God and the Politics of the Apostle (Minneapolis: Fortress Press, 2006), 40.
23 Aristotle, The Politics, I. 1254a.
24 I am especially indebted in this paragraph to Part II (“A Moral Revolution”) in Larry Siedentop, Inventing the Individual: The Origins of Western Liberalism (Cambridge, MA: Belknap Press, 2014), 51–113.
25 Ibid., 82.
26 See Nicholas Wolterstorff, “Modern Protestant Developments in Human Rights,” in Christianity and Human Rights: An Introduction, eds. John Witte Jr. and Frank S. Alexander (Cambridge, England: Cambridge University Press, 2010), 155; http://www.cambridge.org/US/academic/subjects/religion/religious-ethics/christianity-and-human-rights-introduction.
27 See Perez Zagorin, How the Idea of Religious Toleration Came to the West (Princeton: Princeton University Press, 2003).
28 John Witte Jr., “Introduction,” in Christianity and Human Rights, 40.
29 Bruce K. Ward, Redeeming the Enlightenment: Christianity and Liberal Virtues (Grand Rapids, MI: Eerdmans, 2010), 122.
30 Thomas Nagel, The View from Nowhere (Oxford: Oxford University Press, 1986).


The Common Core and Democratic Education Johann N. Neem

DAVID COLEMAN, A FORMER MCKINSEY &

Company consultant and the current president of the College Board, is one of the key figures behind the recent Common Core State Standards initiative. He has been described as “an utterly romantic believer in the power of the traditional liberal arts,” and Time magazine named him one

of the 100 most influential people of 2013. He is also a former Rhodes Scholar “whose conversation,” Dana Goldstein wrote in The Atlantic, “leaps gracefully from Plato to Henry V,” and who has “advanced degrees in English literature from Oxford and classical philosophy from Cambridge.”1

Johann N. Neem, professor of history at Western Washington University, is a visiting faculty fellow at the Institute for Advanced Studies in Culture at the University of Virginia. He is author of Creating a Nation of Joiners (Harvard University Press, 2008). Above: Illustration by Fanatic Studio/Getty Images.



At least on paper, Coleman is precisely the sort of person you would want in charge of a national standards initiative. But when outlining the kind of education he wants for American children, he sets his sights much lower than you might expect. When asked in 2012, for instance, why he chose to become the College Board’s new president, Coleman responded that he believed that the organization could “help the movement towards agreement that college- and career-readiness is the goal of K−12 education in this country.”2 That phrase, drawn directly from the Common Core standards, reflects a diminished understanding of democratic education. Coleman’s fellow business leaders have been more explicit. Chris Kershner, a member of the Dayton (Ohio) Area Chamber of Commerce (and a Common Core advocate), put it this way: “The business community is the consumer of the educational product. Students are the educational product. They are going through the education system so that they can be an attractive product for business to consume and hire as a work force in the future.”3 For Kershner, there is little doubt about whose interest public education should serve. We Americans once saw public education as something more than just preparation for the work force; we saw it as a means of preparing citizens and developing human beings. The Common Core signals an absence of one understanding of education, but also the presence of something else. To understand this something else, we must look to recent history.

From Charlottesville to the Culture Wars

The national standards movement began in September 1989 at a two-day summit in Charlottesville, Virginia, when President George H.W. Bush and forty-nine state governors agreed that the country needed to establish clear national goals and hold schools accountable to them.4

To that end, President Bush put together the National Education Goals Panel, a body of six governors, four members of the administration, and four members of Congress. The panel concluded that higher academic achievement should prepare students for “citizenship, further learning, and productive employment” through engagement in “challenging subject matter including English, mathematics, science, foreign languages, civics and government, economics, arts, history, and geography,” and by teaching students “to use their minds well.”5

The Bush administration urged implementation of national standards and testing in five core areas of study: English, mathematics, science, history, and geography. In 1991, Congress authorized the president to form the National Council on Education Standards and Testing, which in turn offered three reasons to improve education standards: “to promote educational equality, to preserve democracy and enhance the civic culture, and to improve economic competitiveness.”6 To develop national standards, the council called for the creation of a coordinating body composed of public leaders, educators, and members of the public, in keeping with the belief that public education has civic, academic, and economic purposes, as well as multiple stakeholders. President Bush took action, working with the Department of Education, the National Science Foundation, the National Endowment for the Arts, and the National Endowment for the Humanities to bring together teachers and professors who could formulate the standards.


On taking office, President Bill Clinton had every intention of continuing Bush’s work, but then things fell apart. The National History Standards Task Force, set up by Clinton’s predecessor and co-chaired by education professor Charlotte Crabtree and history professor Gary Nash, released a report recommending a focus on “both the nation’s diversity exemplified by race, ethnicity, social and economic status, gender, region, politics, and religion, and the nation’s commonalities,” as well as encouraging students to understand “our common civic identity and shared civic values.”7 The op-ed pages went wild, beginning with former NEH Chair Lynne Cheney’s characterization of the standards as offering “unqualified admiration for people, place, and events that are politically correct” at the cost of “traditional history.”8 Crabtree and Nash released their own response to what they called a “right-wing assault,” arguing that the standards reflected the state of the field and that democracies have an obligation to teach history in a way that includes all people.9

A similar debate took place over the standards for English proposed by the International Reading Association and the National Council of Teachers of English in 1996. The very first one recommended that students “read a wide range of print and nonprint texts to build an understanding of texts, of themselves, and of the cultures of the United States and the world,” in order “to acquire new information, to respond to the needs and demands of society and the workplace, and for personal fulfillment,” but there was little about what might constitute America’s literary tradition.

Another of the English standards called for students to “develop an understanding of and respect for diversity in language use, patterns, and dialects across cultures, ethnic groups, geographic regions, and social roles.”10 Conservatives balked at what they perceived as the relativization of English into dialects (although the report recognized that “some varieties of English are more useful than others”).11 By seeming to embrace multiculturalism, the standards opened another front in the culture wars. To conservative critics, such standards proved that most academics and educators could not be entrusted with designing a balanced curriculum. Not surprisingly, those academics and educators thought the same of their critics.12

The Origins of the Common Core

Although initial efforts under Bush and Clinton failed, the National Governors Association (NGA) and the Council of Chief State School Officers (CCSSO) continued to seek national education standards. Hoping to avoid a repeat of the 1990s, they turned in the following decade to business leaders, like-minded foundations, and testing companies. In 2008, Gene Wilhoit, the CCSSO executive director, and David Coleman, the future architect of the Common Core, traveled to Seattle to meet Microsoft founder and leading philanthropist Bill Gates and his wife, Melinda. Soon thereafter, the newly created Gates Foundation awarded more than $200 million to the cause, including funds for policy groups, academics, and studies. The movement also won the backing of the man who would become president: Senator Barack Obama.13 Once the NGA and CCSSO decided to move forward, they hired Coleman’s own organization, Student Achievement Partners, to develop the standards. It was clear from the beginning that the standards would be designed in-house, with little input from the academic community.


The people appointed to the standards working group were overwhelmingly from the business and testing worlds: Out of ten members, only one was a professor. Four came from the testing organization ACT, one from Student Achievement Partners, one from Pearson–America’s Choice, three from the College Board, three from Achieve (a nonprofit directed by governors and business leaders to promote education reform), and one from the communications firm VockleyLang. While there would be more academic input through feedback groups and validation committees, it was clear where the real power was vested.14 Behind the decision by the NGA and CCSSO to turn to the business world was a new set of ideas about governance, reliant on the insights of what is called “agency theory.”15 In their 1976 article “Theory of the Firm: Managerial Behavior, Agency Costs, and Economic Structure,” economists Michael Jensen and William Meckling argued that in any organization in which ownership is separated from direct control, owners (as principals) must delegate authority to employees (their agents) who have their own competing interests.16 The problem is exacerbated when it comes to skilled professional work, because professionals—teachers, doctors, professors— have traditionally exercised discretion and judgment.17 According to agency theory, principals must devise tools to align agents’ interests with those of their principals, including close monitoring, economic incentives, and market accountability. When applied to school reform, agency theory recommends that teachers’ remuneration be tied to external measures of success (such as test scores) and market accountability bolstered through national scorecards or school choice.18 Nowhere is agency theory’s influence on education reform clearer than in New York City, where, under Mayor Michael Bloomberg and Chancellor Joel Klein, the public schools were granted autonomy in return for achieving specific performance

goals. If schools failed, they would be sanctioned or shut down.19 Under agency theory, professionals are treated as rational actors seeking to maximize their utility by responding to carrots and sticks, but there is no accounting in agency theory for how, absent strong professional communities, the goods they stand for—education, medicine, law, journalism—will be sustained. In short, without the moral and political resources of professionalism, the purpose of public education is at risk of being manipulated by managers to ends foreign to it.20

And that is exactly what is happening. Under the first Bush administration, the civic, human, and economic purposes of public education were front and center in discussions of national standards. Every student deserved access to high-quality subject matter: “Students,” ran one Bush-era report, “sometimes have not been introduced to literature because the focus has been on basic skills.”21 Studying “literature” would “enrich life experiences, increase employability, and enhance communication.” Such study served egalitarian ends. The Common Core got rid of all this, whittling away the democratic and human purposes of education, until all that was left were the basic skills. The opening paragraph of the standards for “English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects,” co-written by David Coleman, limits the standards’ goals to “college- and career-readiness.”22 Only in the final paragraph of the introduction to the standards do the authors acknowledge that learning to read and write well has “wide


applicability outside the classroom and work place,” including to the enjoyment of literature, the parsing of data, and the preparation of people for “private deliberation and responsible citizenship in a republic.” These other outcomes are not presented as goals, however, but as the “natural outgrowth” of work force readiness.

Widening the Gap

The Common Core’s strength is its emphasis on engaging texts seriously. In one of his more (in)famous comments, Coleman has mocked educators for caring more about what students think than about teaching them to read and write effectively: “As you grow up in this world you realize people really don’t give a shit about what you feel or what you think. What they instead care about is can you make an argument with evidence, is there something verifiable behind what you’re saying or what you think or feel.” But making arguments to what end? When Coleman talks about preparing students for the “world,” it is clear that he means the working world. “It is rare in a working environment,” Coleman continued, “that someone says, ‘Johnson, I need a market analysis by Friday but before that I need a compelling account of your childhood.’”23 One Coleman critic has responded that while employers may not care about the personal and civic lives of workers, in a democracy “citizens have a sincere interest in what other citizens have to say.”24 Empathy matters. Knowledge also matters. The arts and sciences offer students ways to make sense of the world they inhabit. For Coleman, however, these intellectual goals are secondary to career skills. In his essay “Cultivating Wonder,” he begins, “So much depends on a good question.” The right question “invites students into a text or turns them away.” The standards must “come to life,” which can only be made to happen by exploring “specific questions about particular texts.” He aspires to

help students “pay attention,” to get under the surface of a work and truly understand it.25 So far, so good. But what do these standards look like in practice? In an online lesson on Martin Luther King Jr.’s “Letter from Birmingham Jail,” Coleman does not allow students access to historical context or any other framing. To him, there is only the text, and nothing should interpose itself between the student and the text. The idea is to enable students to deal seriously with the words before them, to do what he calls “the hard work of reading a text closely, carefully, and well.” Such an approach ought to elevate, even ennoble, texts. But Coleman seems to care little about the impact that a good, close reading might have on students as people and citizens. Reading King is important because it develops, as Coleman puts it at the beginning of his lesson, “a college- and career-ready skill”—not because of King’s insights into the human condition, Christianity, or American history. From the perspective of college- and career-readiness, the content is arbitrary.26

Coleman’s invitation to engage texts is undermined by his presumption that instrumental skills matter more than the particular ends to which they are devoted.27 By ensuring that King’s letter is read in isolation from its historical context or larger conversations, Coleman does not allow students to learn much from King’s message. Far from ennobling the text, Coleman has dismissed what the “Letter from Birmingham Jail” might teach us.



This approach has similarities with that of the American progressive education movement, whose champions criticized traditional academic subjects for being irrelevant to students—or, as John Dewey put it in his 1916 Democracy and Education, “so much material to be studied.” The curriculum had to be made child-centered and relevant to a diverse student body. For some progressive reformers, this meant a more intense focus on vocational education, but for others, including progressive education’s heirs in today’s schools of education, the primary aspiration was to empower students as democratic citizens.28 In her examination of progressive education’s legacy to twentieth-century American education, historian Diane Ravitch criticizes schools’ “flight from content and from knowledge.” By denying students access to the insights of the arts and sciences, Ravitch worries, American schools will widen the “gap between the educated haves and the poorly schooled have-nots.”29 The same criticism applies to the Common Core.

Evidence of Emptiness

The debate between skills and knowledge is older than the Common Core. The Common Core’s approach to reading and writing hearkens back to the Renaissance, when new teachers called “humanists” emphasized literary skills (grammar, rhetoric, and logic). Renaissance humanists believed that the study of ancient texts taught students “to write and speak well,” skills vital for employment in church or government and for preparation for the higher university studies of law, theology, and medicine.30 Well before the Renaissance revived interest in ancient texts, the ancients themselves debated the relationship between skills and knowledge.31

In Plato’s Gorgias, Socrates argues with Gorgias, a famous sophist and rhetorician. Socrates claims that all speech worthy of its name is shaped by knowledge: Doctors speak about medicine because they know medicine. This, Socrates continues, is equally true for “the other arts and sciences,” which are “concerned with the subject belonging to the particular art or science.” Gorgias, favoring rhetoric, responds, “You don’t have to learn the other arts and sciences, only this one [rhetoric], and you’re on par with the experts.” In turn, Socrates wonders whether one should go to a doctor who understands how to persuade people but does not understand medicine. To Socrates, speech cannot be an independent art because it depends on knowledge.32

In The Ideal Orator, an influential text for Renaissance humanists, the Roman statesman Cicero claims that oratory requires both knowledge and rhetoric. To Cicero, oratory is “something greater, and is a combination of more arts and pursuits, than is generally supposed.” The world has few orators, Cicero avers, because it is impossible to be a good orator “unless [one] has gained a knowledge of all the important subjects and arts.” Without knowledge, Cicero argues, “the orator’s speech will remain an utterly empty, yes, almost childish verbal exercise.” Society needs philosophers (Cicero wrote that he “would prefer inarticulate wisdom to babbling stupidity”), but not everyone must become one. Instead, a liberal education would develop the skills, knowledge, and dispositions or virtues necessary to use philosophy’s insights to inform action in the world. Unlike the pure philosopher or sophist, the ideal orator “unites wisdom and eloquence,” knowledge with skills and virtue.33 That remains a worthy aspiration for the graduates of our public schools, some of whom will become philosophers and scientists, but all of whom are human beings and citizens.

By favoring skills over knowledge, the Common Core reduces education to sophistry. Common Core advocates might respond to such criticism by claiming, as Secretary of Education Arne Duncan put it in a 2013 speech in Washington, that the standards “are the goals,” whereas a curriculum “is what teachers teach.”34 This distinction, while useful, is also questionable. Given the Obama administration’s embrace of high-stakes testing, the skills required by the Common Core may well push out other aspects of the curriculum; indeed, there is evidence that this is already happening and that textbook companies are designing new curricula geared to the Common Core. Because test scores appear to be objective, school leaders know that parents use them as a proxy for quality. Schools will teach to the test even if it means, as Mike Rose has written, diminishing “our definition of human development and achievement—that miraculous growth of intelligence, sensibility, and the discovery of the world—to a test score.”35

In our effort to evade the culture wars, we have instead embraced a managerial understanding of education shaped by agency theory and the priorities of business leaders. The Common Core offers students instrumental skills divorced from the purposes for which those skills might be used. In their book Winner-Take-All Politics, political scientists Jacob Hacker and Paul Pierson argue that partisan gridlock helps those with economic power.36 The same may be true for cultural gridlock: It leaves the economic as the only common ground for policymakers to invoke. Agreeing on little, we have reduced our national aspirations to “college- and career-readiness.” Those words are evidence of a deeper emptiness.


Endnotes

1 Dana Goldstein, “The Schoolmaster,” Atlantic Monthly, October 2012. See also Joy Resmovits, “David Coleman, Common Core Writer, Gears Up for SAT Rewrite,” Huffington Post, August 30, 2013; http://www.huffingtonpost.com/2013/08/30/david-coleman-common-coresat_n_3818107.html.
2 Frederick Hess, “Straight Up Conversation: Common Core Architect and New College Board President David Coleman,” Education Next, June 4, 2012; http://educationnext.org/straight-up-conversation-commoncore-architect-and-new-college-board-president-davidcoleman/.
3 Valerie Strauss, “The Quote That Reveals How At Least One Corporate School Reformer Really Views Students,” Washington Post (online edition), August 27, 2014; http://www.washingtonpost.com/blogs/answer-sheet/wp/2014/08/27/the-quote-that-reveals-how-at-least-onecorporate-school-reformer-really-views-students/.
4 For most of the narrative that follows, I rely on Maris Vinovskis, The Road to Charlottesville: The 1989 Education Summit (Washington, DC: National Education Goals Panel, 1999), 34; and John F. Jennings, Why National Standards and Tests? Politics and the Quest for Better Schools (Thousand Oaks, CA: Sage Publications, 1998), 94. See also Carl F. Kaestle and Alyssa E. Lodwick, eds., To Educate a Nation: Federal and National Strategies for School Reform (Lawrence, KS: University Press of Kansas, 2007), 92; and Lawrence J. McAndrews, The Era of Education: The Presidents and the Schools, 1965–2001 (Urbana, IL: University of Illinois Press, 2006), 133–66.
5 Jennings, Why National Standards and Tests, 14.
6 National Council on Education Standards and Testing, Raising Standards for American Education (Washington, DC: US Department of Education, 1992), 3.
7 National Center for History in the Schools, National Standards for United States History: Exploring the American Experience: Grades 5–12 (Los Angeles: University of California–Los Angeles, 1994), 3; Andrew Hartman, A War for the Soul of America: A History of the Culture Wars (Chicago: University of Chicago Press, 2015), 275.
8 Lynne Cheney, “The End of History,” Wall Street Journal, October 20, 1994.
9 Gary Nash, Charlotte Crabtree, and Ross E. Dunn, History on Trial: Culture Wars and the Teaching of the Past (New York: Knopf, 1997), Chapter 8. See also Diane Ravitch, Left Back: A Century of Failed School Reforms (New York: Simon & Schuster, 2000), 429–52; Jennings, Why National Standards and Tests, Chapter 7; and James Davison Hunter, Culture Wars: The Struggle to Define America (New York: Basic Books, 1991), Chapter 8.
10 International Reading Association and National Council of Teachers of English, Standards for the English Language Arts (Newark, DE, and Urbana, IL: National Council of Teachers of English, 1996), 3, 16, 21–22.
11 Ravitch, Left Back, 437–38.
12 Diane Ravitch, “Hijacked! How the Standards Movement Turned into the Testing Movement,” The Death and Life of the Great American School System (New York: Basic Books, 2010), 15–30; Ravitch, “Education after the Culture Wars,” Daedalus 131, no. 3 (2002), 5–21.
13 Lindsey Layton, “How Bill Gates Pulled Off the Swift Common Core Revolution,” Washington Post, June 7, 2014; Valerie Strauss, “Gates Gives $150 Million in Grants for Common Core Standards,” Washington Post (online edition), May 12, 2013; http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/05/12/gatesgives-150-million-in-grants-for-common-core-standards/. See also Anthony Cody, The Educator and the Oligarch: A Teacher Challenges the Gates Foundation (New York: Garn Press, 2014), Chapter 14; Ravitch, Death and Life, Chapter 10; and Donald Zancanella and Michael Moore, “The Origins of the Common Core: Untold Stories,” Language Arts 91, no. 4 (2014), 273–79.
14 Information on the committees can be found at the website of the National Governors Association in a release titled “Common Core State Standards Development Work Group and Feedback Group Announced,” July 1, 2009; http://www.nga.org/cms/home/news-room/newsreleases/page_2009/col2-content/main-content-list/title_common-core-state-standards-development-workgroup-and-feedback-group-announced.html. See also Mercedes Schneider, A Chronicle of Echoes (Charlotte, NC: Information Age Publishing, 2014), 173–83; Joy Pullman, “Five People Wrote ‘State-Led’ Common Core,” Heartlander Magazine, June 7, 2013; http://news.heartland.org/newspaper-article/2013/06/07/five-peoplewrote-state-led-common-core.
15 Charles Kerchner, David Menefee-Libey, and Laura Steen Mulfinger, “Comparing the Progressive Model and Contemporary Formative Ideas and Trends,” in The Transformation of Great American School Districts: How Big Cities Are Reshaping Public Education, eds. William Lowe Boyd, Charles Kerchner, and Mark Blyth (Cambridge, MA: Harvard Education Press, 2008), Chapter 1; Rakesh Khurana, From Higher Aims to Hired Hands: The Social Transformation of American Business Schools and the Unfulfilled Promise of Management as a Profession (Princeton: Princeton University Press, 2007), 317–26; Kathleen Eisenhardt, “Agency Theory: An Assessment and Review,” Academy of Management Review 14, no. 1 (1989), 57–74. For context, see also Daniel Rodgers, Age of Fracture (Cambridge, MA: Harvard University Press, 2011), Chapter 2.
16 Michael Jensen and William Meckling, “Theory of the Firm: Managerial Behavior, Agency Costs, and Economic Structure,” Journal of Financial Economics 3 (1976), 305–60.
17 Malileh Mansouri and Julia Adair Rowney, “The Dilemmas of Accountability for Professionals: A Challenge for Mainstream Management Theories,” Journal of Business Ethics 123 (2014), 45–56. See also the thoughtful discussion in Joseph Heath, “The Uses and Abuses of Agency Theory,” Business Ethics Quarterly 19, no. 4 (2009), 497–528.
18 Kerchner et al., “Comparing the Progressive Model”; see also Paul E. Peterson, ed., Choice and Competition in American Education (Lanham, MD: Rowman & Littlefield, 2006), 29.
19 Paul T. Hill, “Leadership and Governance in New York City School Reform,” in Education Reform in New York City: Ambitious Change in the Nation’s Most Complex School System, eds. Jennifer O’Day, Catherine Bitter, and Louis Gomez (Cambridge, MA: Harvard Education Press, 2011), Chapter 1; Katharine Destler, “Creating a Performance Culture: Incentives, Climate, and Organizational Change,” American Review of Public Administration, published online October 6, 2014, doi:10.1177/0275074014545381.
20 Beryl Radin, Challenging the Performance Movement: Accountability, Complexity, and Democratic Values (Washington, DC: Georgetown University Press, 2006), Chapter 4; Donald P. Moynihan, The Dynamics of Performance Management: Constructing Information and Reform (Washington, DC: Georgetown University Press, 2008), Chapter 7. See also Jon D. Michaels, “Running Government Like a Business…Then and Now,” Harvard Law Review 128 (Feb. 2015); Robert Locke and J.C. Spender, Confronting Managerialism: How the Business Elite and Their Schools Threw Our Lives Out of Balance (London, England: Zed Books, 2011).
21 National Council on Education Standards and Testing, Raising Standards, 22–24.
22 Common Core State Standards Initiative, Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects, 2010; http://www.corestandards.org/wp-content/uploads/ELA_Standards.pdf.
23 Schneider, Chronicle of Echoes, 169.
24 Nicholas Tampio, “David Coleman’s Plan to Ruin Education,” Al-Jazeera America, December 5, 2014; http://america.aljazeera.com/opinions/2014/12/commoncore-collegeboardeducation.html. Martha Nussbaum, in Not for Profit: Why Democracy Needs the Humanities (Princeton: Princeton University Press, 2010), identifies “empathy” as one of the essential dispositions of democratic citizenship, x.
25 David Coleman, “Cultivating Wonder,” The College Board, 2013, accessed April 29, 2015; https://dl.dropboxusercontent.com/u/2901650/Blog%20Docs/CultivatingWonder.pdf. Coleman’s approach plays into a decades-old debate among scholars of literature about how to read a text. For context, see Terry Eagleton, Literary Theory: An Introduction (Minneapolis: University of Minnesota Press, 2008).
26 David Coleman, “Middle School ELA Curriculum Video: Close Reading of a Text: ‘Letter from a Birmingham Jail’” (Dec. 2012); https://www.engageny.org/resource/middleschool-ela-curriculum-video-close-reading-of-a-text-mlkletter-from-birmingham-jail.
27 A similar critique is offered by Paul Deneen, “Common Core and the American Republic,” The American Conservative (online edition), November 20, 2013; http://www.theamericanconservative.com/common-coreand-the-american-republic/.
28 John Dewey, Democracy and Education (New York: Macmillan, 1916), 134. On progressive ideas’ influence in schools of education, see David Labaree, The Trouble with Ed Schools (New Haven, CT: Yale University Press, 2004), Chapter 7.
29 Ravitch, “Education after the Culture Wars.” For historical discussions of the origins of progressive pedagogy, see Ravitch, Left Back; William J. Reese, “The Origins of Progressive Education,” History of Education Quarterly 41, no. 1 (2001), vi–24; Herbert Kliebard, Struggle for the American Curriculum, 1893–1958 (New York: Routledge, 1995).
30 Paul Oskar Kristeller, Renaissance Thought: The Classic, Scholastic, and Humanistic Strains (New York: Harper, 1961), 13; Anthony Grafton and Lisa Jardine, From Humanism to the Humanities: Education and the Liberal Arts in Sixteenth-Century Europe (Cambridge, MA: Harvard University Press, 1986).
31 My discussion relies on Bruce Kimball, Orators and Philosophers: A History of the Idea of Liberal Education (New York: Teachers College, Columbia University, 1986); and Francis Oakley, Community of Learning: The American College and the Liberal Arts Tradition (New York: Oxford University Press, 1992), Chapters 1–2.
32 All quotes from Plato, Gorgias, trans. Tom Griffith; ed. Malcolm Schofield (Cambridge, England: Cambridge University Press, 2010).
33 All quotes from Cicero, On the Ideal Orator, trans. James May and Jakob Wise (New York: Oxford University Press, 2001).
34 Arne Duncan, “Duncan Pushes Back on Attacks on Common Core Standards,” June 25, 2013, US Department of Education; http://www.ed.gov/news/speeches/duncan-pushes-back-attacks-common-corestandards.
35 Mike Rose, Why School? Reclaiming Education for All of Us (New York: New Press, 2009). There is evidence of declining public support for the Common Core among voters of both parties. See A. Jochim and L. Lavery, “The Evolving Politics of the Common Core: Implementation and Conflict Expansion,” Publius (forthcoming).
36 Jacob S. Hacker and Paul Pierson, Winner-Take-All Politics: How Washington Made the Rich Richer—and Turned Its Back on the Middle Class (New York: Simon & Schuster, 2011).


BOOK REVIEWS

The Great Accumulation
Karl Shuve

The Ransom of the Soul: Afterlife and Wealth in Early Western Christianity
Peter Brown
Cambridge, MA: Harvard University Press, 2015.

When Cardinal Jorge Mario Bergoglio was elected pope in the early evening hours of March 13, 2013, he took a name—Francis—that no previous pontiff had chosen. It was a weighty decision to link his papacy symbolically with Il Poverello, the little poor man of Assisi, whose Order of Friars Minor came to be viewed with suspicion by the medieval Roman Catholic Church for its teaching that Christ and the apostles owned no possessions. But in allowing the themes of humility and poverty to shape his papacy, Pope Francis has endeared himself to the many people, within and outside the Catholic Church, who have difficulty reconciling the church’s immense wealth with the teachings of Christ found in the Gospels. Whether in the spectacular Gothic architecture of Notre-Dame de Paris, the sumptuous surroundings of Vatican City, or the glittering mosaics of San Vitale in Ravenna, one can sense the seeming contradiction between the church’s lavish expenditure of resources and Jesus’s command to the rich young man that if he wishes to obtain eternal life, he must sell all that he owns and “give the money to the poor” (Mark 10:21, New Revised Standard Version). How can such ostentation stand alongside the radical proclamation in Luke 6:20, “Blessed are you who are poor”? Something, it seems to many, has gone remarkably wrong. But why did the Catholic Church come to have so much wealth in the first place, and what have been the social, political, and cultural implications of this great accumulation?

These are questions the eminent historian Peter Brown, a professor emeritus at Princeton, has spent much of the last two decades answering, nowhere more directly than in his two most recent books. The Ransom of the Soul, an expanded version of lectures he delivered in 2012, follows on the argument developed in his magisterial 792-page tome Through the Eye of a Needle, which was published earlier that same year by Princeton University Press. Although Brown’s scholarship speaks with profundity and insight to the apparent contradiction within the Catholic Church and Christianity more broadly, his intellectual energy is focused more directly on a problem of a different kind: What was the relationship between the rise of Christianity and the fall of Rome? In particular, Brown challenges one of the central claims of eighteenth-century historian Edward Gibbon, who in the first volume of The History of the Decline and Fall of the Roman Empire laid much of the blame for Rome’s flagging vigor on the influence of Christianity. Among other ills, Gibbon charged, the Christian church had been responsible for causing “a large portion of public and private wealth” to be “consecrated to the specious demands of charity and devotion” (I.39). From the beginning of his career, Brown has resisted narratives of decline, favoring instead the language of transformation. In his seminal The World of Late Antiquity (1971), he framed the centuries before and after the cessation of Roman rule in the West as a time of great religious and philosophical innovation, which witnessed the rise of the Christian church, rabbinic Judaism, Neoplatonism, and Islam. In the years between that book and his last two, Brown helped to reshape the ways in which students of the ancient world think about the development of Christianity.

Always trying to discover the deeper streams of change that run beneath great political events, he has consistently resisted notions of religious decline as much as he has challenged facile accounts of social and political decline. Practices and beliefs that earlier generations of scholars dismissed as corrosive superstition, representing a descent from enlightened antiquity to Dark Age barbarism, now appear, thanks to Brown, in a wholly new light. The problem of the accumulation of wealth that has gripped Brown so thoroughly over the last four decades of his career arguably stands at the nexus of two types of comprehensive change: political and religious. The church’s growing wealth can be viewed both as the cause of the fall of Rome in the West and as a sign of the “greed” that allegedly came to beset the Christian church. Through the Eye of a Needle, which has been amply discussed in other reviews, obliterates Gibbon’s accusation that the flow of wealth into the church’s coffers sapped the Roman state of its treasure and strength. A long and demanding read, even for the specialist, the book shows precisely how Christian bishops from the late fourth century through the sixth transformed Roman civic benefaction into an ideal of care for the poor. The Ransom of the Soul takes up the problem from a different perspective by examining the way in which wealth served as a conduit between the present world and the hoped-for hereafter. Elegantly written and eminently succinct, the book preserves in large measure the voice of the original spoken delivery. It almost has the feel of a travelogue, with Brown guiding the reader from the fertile plains of third-century North Africa to the harsh forests of eighth-century Germany. The narrative reflects Brown’s long-standing insistence that the first millennium of the Christian era was a time of both continuity and change. He opens with several Gospel texts that promote the view that wealth is not so much divested as “transferred” from this world to the next. Jesus is quoted as telling the rich young man to give to the poor so that he might “have treasure in heaven” (Matthew 19:21).

Mosaic of the widow’s mite, Ravenna, sixth century.

As a way of demonstrating how indebted medieval Christians were to this conception of religious giving, Brown juxtaposes Jesus’s words to the rich young man with the story of a sixth-century Roman cobbler, who each week gave alms to the poor at the shrine of Saint Peter. Each time the cobbler performed this act of charity, a pious man in Rome received a vision of a brick being added to a glorious heavenly mansion. The cobbler was, quite literally, storing up treasure for himself in heaven by divesting himself of it on earth. Where other historians may imagine only a vast chasm between the first and sixth centuries, Brown reveals continuities and connections, in this case by showing how the recorded words of Jesus gave a distinctive shape to the medieval notion of giving. But, of course, there can be no historical narrative without change, and this is what most interests Brown. And to locate its sources, he focuses primarily on changing views of the soul’s journey after death and the shape of the cosmos. If we are tempted to believe that the church followed the money, Brown has it the other way around: Attitudes toward wealth and giving changed in response to new beliefs about sin and the afterlife.

Characteristically for Brown, these changes are not explained away by major political events, such as the accession of Constantine in 306 and the deposition of the last emperor, Romulus Augustulus, in 476. Far more determinative of the way the church thought about and used money was a slow rethinking of how bonds were formed between the living and the dead, and what those in one realm could do for those in the other. Brown traces several broad movements. The first is a change from concern with the afterlives of martyrs to concern with those of ordinary Christians who neither lived nor died heroically. In the writings of the earliest Latin Christians, it was imagined that martyrs were immediately transposed to heaven, where they could intercede on behalf of the living. Those who lived more mundane lives were believed to abide in a shadowy place of rest until the final judgment and the resurrection of the dead. A new landscape, however, opens before our eyes in the writings of Augustine, who dominates the book’s middle chapters. Inhabiting a far more thoroughly Christianized world, Augustine trained his gaze on the non valde boni—the “not altogether good” who made up the vast majority of his congregation. Their lives were marked by the accumulation of small debts—minor sins against God and their neighbors—that could be expiated through alms giving. Brown argues that it was Augustine of Hippo who ensured that subsequent generations of Western Christians would link repentance with the giving of money to the poor and the church. The second major movement is a complete re-imagining of the cosmos itself. In the ancient world, it was thought that the dead—or at least the elite or holy dead—ascended to the stars, where they enjoyed eternal life. Christians believed this as fervently as their “pagan” neighbors. But in the fifth century, especially in Gaul, the journey of the soul after death became a much more hazardous and uncertain proposition. It was no longer a rapid ascent, but a bitter struggle in a foreign landscape dotted with demons and devils.

The only hope of a successful journey lay in performing radical acts of giving in life and in soliciting the assistance of the saints and the living after death. If one could construct an elaborate shrine to a saint and ensure burial next to that saint, it was imagined that one’s postmortem journey would be easier and more likely to succeed. It is here that we begin to move toward the medieval understanding of purgatory, with Brown’s narrative ending on the cusp of its emergence. It would have been easy for medieval Christians, as Brown so eloquently puts it, to see how “some of that imagined treasure had, as it were, dripped back to earth,” in the form of great shrines and churches. And with little effort, we ourselves can continue to witness the legacy of this new vision of the cosmos and the afterlife, which encouraged the wealth of innumerable Christians deposited in heaven to drip back, sometimes slowly and sometimes rapidly, into the present world. Debates over the appropriateness of this great accumulation of wealth will surely continue unabated—as they have since before the time of Francis of Assisi himself. But Brown’s Ransom of the Soul provides a more nuanced, textured, and sympathetic view of how this system, strange though it may seem to us, came to be in the first place. Brown presses us to imaginatively inhabit a world in which the religious and the commercial do not constitute separate zones, but collaborate to hold the cosmos together. At the end of this remarkable tour, we may even be inclined to agree with our guide: “Perhaps it is we who are strange.”

Karl Shuve is an assistant professor of religious studies at the University of Virginia. His first book, The Song of Songs and the Fashioning of Identity in Early Latin Christianity, is forthcoming from Oxford University Press.


School of Athens (detail) by Raphael; Vatican Museums and Galleries, Vatican City/Bridgeman Images.

Right Reading
Steven Knepper

Literary Criticism from Plato to Postmodernism: The Humanistic Alternative
James Seaton
New York: Cambridge University Press, 2014.

James Seaton champions a tradition of accessible literary criticism that today is more commonly found in high-end periodicals and reviews than in university departments of literary studies. To adepts of this humanistic tradition, literature provides, in his words, “valuable insight into human life in all its variety.” Seaton traces humanistic criticism back to Aristotle and up through once-influential but now-neglected twentieth-century American critics. Among more recent contemporary practitioners of this sort of criticism we might count Marilynne Robinson, Wendell Berry, Adam Kirsch, and James Wood. Seaton, a professor of English at Michigan State University, contrasts this humanistic criticism with the approach encouraged in much modern and postmodern literary theory. He claims that politicized frameworks—theoretical cookie cutters like Marxism, feminism, and psychoanalysis—turn great novels, poems, and drama into little more than fodder for theory. They encourage students to be detached critics, insulated against the naive view that literary works offer any lasting insight into human life. The result, Seaton says, is predictable, jargon-laden critique. His diagnosis comes at a time when scholars across the humanities are voicing similar concerns about the dominance of a smug, flatfooted hermeneutics of suspicion. Bruno Latour, for instance, has called practitioners of formulaic critique “critical barbarians,” and Rita Felski has examined the limits of suspicious reading in her 2008 book Uses of Literature. A 2013 report by Harvard College found that the humanities have overemphasized theory at the expense of other approaches to literature, including literary history.

Seaton contributes to such disciplinary soul-searching in two significant ways. First, he argues that the history of literary criticism can be understood as an ongoing battle among politicized Platonism, mystical Neoplatonism, and sensible Aristotelianism. Second, he rehabilitates some twentieth-century practitioners of this Aristotelian humanism—namely Edmund Wilson, Lionel Trilling, and Ralph Ellison—who still have something to teach literary studies today. Before weighing those contributions, I should note a significant difference between Seaton and scholars such as Felski. The latter works from within the loosely defined parameters of “theory,” drawing on phenomenology, hermeneutics, and strands of feminism to articulate a critical approach that is more open to the pleasures of reading and the insights offered by literary works. Seaton, on the other hand, is a seasoned veteran of the culture wars who has been publishing broadsides against theory for decades.

So while Felski criticizes literary theory from within, Seaton criticizes it from without. This outsider stance is both a strength and a weakness. It positions Seaton to recover critics such as Wilson, Trilling, and Ellison who have been left out of the theory canon, but it also makes him prone to blanket condemnations of a diverse set of scholars, some of whom support the reforms he advocates. At their worst, to be sure, literary theorists pluck decontextualized dicta from Foucault and Derrida and present them as holy writ for a sort of poststructuralist catechism. But a good theory course can introduce students to debates about literature, language, aesthetics, ethics, and politics that do indeed stretch back to Plato. Likewise, not all theory-informed criticism runs roughshod over literary works, and some of it is even engagingly written. Seaton’s polemic tends to brush aside the claims of theorists rather than engage with them. In this regard, one might be better served by Felski or a radical humanistic critic such as Terry Eagleton, who, after writing what many consider the definitive book on theory in 1983 (Literary Theory: An Introduction), launched a thorough critique of theory’s excesses. In works such as After Theory (2003), Eagleton defends the persistence of the human condition as the justification for “humanistic” criticism; in so doing, he confronts postmodern challenges that are more formidable than Seaton acknowledges. Seaton offers a particularly one-sided treatment of the Frankfurt School theorists, who figure as the main villains in his account of the decline of literary studies. The reader learns that “the originality of the critical theorists derived from their willingness to ignore or discount all the economic, social, and political gains achieved in the twentieth century.” But one would never learn from Seaton that Walter Benjamin wrote a suggestive piece on the eclipse of oral storytelling or that Theodor Adorno had nuanced things to say about the politics of lyric poetry. A renewed humanistic criticism need not categorically dismiss these German intellectuals.

Seaton relies on similarly broad strokes in his attempt to sort all of literary criticism into three rival traditions: the Platonist, which is “suspicious of poems, plays, and fiction because they reinforce the prejudices and false consciousness of the unenlightened majority”; the Neoplatonist, which values literature “as a vehicle for moral and/or spiritual transcendence of conventional common sense”; and the Aristotelian, which offers a “middle way between the Platonic condemnation of art and literature and the Neoplatonic elevation.” This schema allows Seaton to point out some affinities between seemingly far-removed positions. It’s also a rhetorically effective way to situate humanistic criticism as the sensible alternative to two extremes. (Today’s politicized theorists are “Platonists,” by Seaton’s account.) But the categories are so capacious that they risk obscuring the history of criticism instead of clarifying it. Seaton fully acknowledges that he uses the notion of tradition loosely, as “a matter of affinity and tendency rather than explicit philosophy or theory,” and that this results in some odd bedfellows, but the very word tradition is perhaps misleading here. Seaton does not examine rival traditions so much as he groups critics who reach similar conclusions about the value of literature, regardless of their reasons for doing so. In Seaton’s schema, both a crude Marxist and Simone Weil, a true intellectual descendent of Plato who offers a striking reformulation of The Republic’s moral critique of literature, would be in the same camp. Seaton makes a much more compelling case for revisiting three twentieth-century critics who were influential in their day but seldom gain a place in theory anthologies: Wilson, Trilling, and Ellison. Seaton claims that Wilson “made the literature of high modernism available to the general reader not by slighting its complexity but by refusing to accept its technical innovations as ends in themselves and, instead, pointing to their human meaning as responses to modern life.”

Trilling argued that, in its complexity and ambiguity, literature “opposes the inevitable one-sidedness of all political doctrines, even the most benign.” Ellison, whose novel Invisible Man (1952) is widely taught but whose criticism is neglected, took up the tension between artistic standards and democratic equality. Seaton demonstrates that these humanistic critics continue to offer insight into literature and its relationship to society, and—perhaps more important—provide exemplary models of democratic literary criticism. While Seaton’s attempt to classify all critics as Aristotelians, Platonists, or Neoplatonists results in oversimplification, his take on the recent history of literary criticism returns important voices to the conversation. He also issues a timely call for well-written literary criticism that “conveys to the general public the pleasures and insights that poems, plays, and fiction continue to make available to all those willing to attend.” Whether it takes its bearings from neglected humanist critics or an immanent critique of theory, such criticism provides at least a partial tonic for what ails literary studies.

Steven Knepper is assistant professor of English, rhetoric, and humanistic studies at Virginia Military Institute.

A Diminished Thing
Chad Wellmon

Higher Education in America
Derek Bok
Princeton and Oxford: Princeton University Press, 2013.

The University of Virginia, where I teach, does many different things. It runs a medical system with a 631-bed hospital, twenty-three research centers, and a medical school. It manages an entertainment business that hosts everything from Pearl Jam concerts to monster truck rallies. It supports a start-up incubator for new business ventures. It coordinates a community-wide sustainability effort. It feeds thousands of people every day. It serves as an industry and government research center. It fields twenty-five varsity sports teams. And in addition to all that, it educates more than 20,000 students a year, while supporting the research of some 2,000 full-time faculty members. Like most universities today, the University of Virginia isn’t just an educational institution. It’s a conglomerate.

In 1963, in the middle of the post–World War II higher education boom, when college enrollments were surging and federal research dollars were flowing, Clark Kerr, then-president of the University of California, christened this new institution the “multiversity.” During the twentieth century, he argued, the university had ceased to be a unified community harnessed to a single purpose. It had fragmented into several communities. This new institution, Kerr noted, was to be neither loved nor loathed; it was to be managed. Like any other modern institution, it was to be run by organization men and women, administrators who managed a panoply of corporate enterprises. The new, modern multiversity wasn’t the quaint collegiate community of the past; it was a “mechanism held together by administrative rules and powered by money.” And its leader, as Thorstein Veblen had put it a half century earlier, was not an intellectual leader but rather a “captain of erudition.”

But even as Kerr defended the “multiversity” as the epitome of higher education, he acknowledged that its lack of a clear purpose made it susceptible to competing visions of a university’s purpose. Should the university primarily prepare students for the work force? Should it serve the public more broadly through technical advice and public service? Should it lead the advancement of new knowledge? Should it spur economic growth? Should it educate students to lead rich and full lives?

In his book Higher Education in America, former Harvard president Derek Bok not only embraces Kerr’s “multiversity” but also insists that this institutional form, not the ones typified by the colleges of Oxbridge or the German research university, has become a “model for other countries around the world.” Indeed, he claims that the pursuit of multiple ends has allowed American universities to produce a whole that is “greater than the sum of its parts.” The dual imperative for faculty to teach and conduct research makes professors better teachers and better researchers. The pursuit of economic development and research allows universities to give their scientists access to industry databases and resources. The commitment to public service, whether expressed in undergraduates tutoring in local schools or professors advising government agencies, enriches the university. The university benefits from the “synergies,” both expected and unexpected, of its several pursuits. Its varied and multiple goals make it a better institution. And yet, like Kerr, Bok concedes that the pursuit of so many different goals can make the university a confused and conflicted institution.

As university interests and activities expand, so too do the administrative staffs that manage them. And as the number of deans, provosts, program officers, and budget officers increases, so too do the competing interests. The university becomes a corporation, with its many divisions vying for resources, prestige, and attention. Within the modern-day multiversity, the job of university leaders is simply to manage, as Bok puts it, “a proper balance” among the institution’s multiple, oftentimes competing interests.

But how among these competing ends and purposes is such a “proper balance” to be found? Bok occasionally invokes the “basic academic values of the university” as orienting guidelines. He implores trustees, presidents, deans, and faculty members to work together to preserve “basic” and “important” values. He notes the importance of “honesty” in research, “impartiality and disinterestedness in scholarship,” “freedom of thought,” and “quality teaching and research.” But he only vaguely alludes to what these are, where they come from, and how they are to be sustained. And it is because of this vagueness that one senses that these values have no real home in the multiversity. Of course, Bok insists that they must have a place in order to provide multiversity leaders with a common set of standards by which to reconcile their competing interests. This, however, is little more than a reduction of academic values to instrumental ends and even window-dressing. Without them, the multiversity Bok describes so well would more closely resemble General Electric than the venerable institutions of Paris, Bologna, or Berlin.

The real home of these values is an institution that, sadly, may be on the verge of extinction: the research university or, perhaps more broadly, the academy, where the ideals and traditions dedicated to creating and sharing knowledge flourished. As universities divert their resources from more clearly academic ends, perhaps it’s time for academics to think about what historian Johann Neem calls an “academy in exile.” Consider, for example, the myriad German research institutes founded in the early twentieth century out of frustration with the bureaucratic behemoths that German universities had become. Maybe academics—now a pejorative term for the socially irrelevant and economically unproductive—should abandon the multiversity and their managerial masters and discover again what it’s like to think freely.


Chad Wellmon is an associate professor of German Studies at the University of Virginia and a fellow at the Institute for Advanced Studies in Culture. He is the author most recently of Organizing Enlightenment: Information Overload and the Invention of the Modern Research University (Johns Hopkins University Press).

A Prophet Restored
James L. Nolan Jr.

The Other Solzhenitsyn: Telling the Truth about a Misunderstood Writer and Thinker
Daniel J. Mahoney
South Bend, IN: St. Augustine’s Press, 2014.

Alexander Solzhenitsyn (1918–2008), the Nobel Prize–winning author whose writings did much to expose the atrocities of the communist system in the Soviet Union, was exiled by the Soviet government in 1974. Four years later, while living with his family in Cavendish, Vermont, he was invited to give the commencement address at Harvard University. Much to the surprise and chagrin of many, Solzhenitsyn took aim not only at the despotic system from which he had been exiled but at the flaws of Western democratic capitalism as well. Asking himself whether he “would propose the West, such as it is today, as a model to my country,” he responded unequivocally: “I would frankly have to answer negatively.”

Solzhenitsyn in Russia, 2002.

To put it mildly, the lecture was not well received. From a number of camps, thereafter, both in the West and in the Soviet Union, Solzhenitsyn was viewed with suspicion if not outright derision. The author of The Gulag Archipelago has been variously accused of being anti-democratic, pan-Slavist, a Russian nationalist, an authoritarian scold, an anti-Semite, a theocratic tsarist, even a nostalgist for communism. Daniel Mahoney’s new book, The Other Solzhenitsyn, goes far toward debunking the caricature of Solzhenitsyn that has emerged over the past four decades, and demonstrates the courage, wisdom, and trenchant thinking of the man who first garnered worldwide notice with the publication of One Day in the Life of Ivan Denisovich in 1962. Some of the misunderstanding, according to Mahoney, a professor of political science at Assumption College, in Worcester, Massachusetts, stems from the lack of familiarity with Solzhenitsyn’s later works, many of which have yet to be translated into English.

Mahoney highlights, in particular, Solzhenitsyn’s work from his years of exile and after his return to Russia in 1994, including Two Hundred Years Together, The Little Grain Managed to Land Between Two Millstones, Rebuilding Russia, Russia in Collapse, and his magnum opus, The Red Wheel. Drawing on these and earlier works, Mahoney makes a convincing case that the image of Solzhenitsyn constructed over the past four decades is a grossly distorted one. Mahoney shows, for example, that Solzhenitsyn was anything but anti-democratic. Rather, he was an advocate of “democracy in small spaces,” who urged Russians to establish democratic self-governance from the bottom up. As worthy examples of this model, Solzhenitsyn pointed to the local governing practices of Switzerland and New England, both of which he had witnessed firsthand. In addition to these models, he urged Russians to look to their own zemstvos—the small governing councils of local Russian provinces in the nineteenth century. “I have always insisted on local self-governance in Russia,” Solzhenitsyn asserted in an interview in Der Spiegel a year before his death. Given this view of democracy, it is not surprising that the Russian Orthodox believer was an admirer of Catholic social teaching and of Pope John Paul II, whom he met in 1993 and whose election in 1978 he had described as “a gift from God.” Solzhenitsyn’s view of democracy (and his criticisms of both industrial capitalism and socialism) was actually very much in keeping with the subsidiarity principle of Catholic social teaching and the distributist ideals advocated by G.K. Chesterton and Hilaire Belloc. While Mahoney touches on these “small is beautiful” themes, the affinities between Solzhenitsyn’s views, Catholic social teachings (beginning with Pope Leo XIII’s 1891 encyclical Rerum Novarum), and the writings of the English distributists are developed more fully in Joseph Pearce’s biography, Solzhenitsyn: A Soul in Exile.

In what may be the most intriguing section of the book, Mahoney explores Solzhenitsyn’s views on Russia’s last three major political leaders. Never much of a fan of either Mikhail Gorbachev or Boris Yeltsin, Solzhenitsyn was a qualified supporter of Vladimir Putin. This position no doubt contributed to misapprehensions about Solzhenitsyn and the view that he was an authoritarian Russian nationalist. Mahoney, however, demonstrates that here, as elsewhere, critics misunderstand Solzhenitsyn. Solzhenitsyn did give Putin credit for “his role in gradually restoring the strength and self-respect of the Russian people,” after the Russian leader had “inherited a ransacked and bewildered country, with a poor and demoralized people.” However, there were also aspects of Putin’s political leadership that displeased him, including the continuing corruption, the lack of public repentance for the crimes of communism, and the slow pace toward the development of democracy. That said, Solzhenitsyn also partially faulted the West for Russian resistance to democracy. After the instability of the 1990s—what Solzhenitsyn referred to as Russia’s “third time of troubles”—Russians tended to associate Western democracy with the widespread chaos and kleptocracy of the Yeltsin years. That Solzhenitsyn would appreciate Putin’s role in helping to restore the morale of the Russian people is consistent with Solzhenitsyn’s broader views regarding national character and of the role of Providence “in the collective lives of nations and peoples.” Solzhenitsyn placed high value on the distinctive qualities of a people. As he put it in his Nobel lecture, “Nations are the wealth of mankind, its collective personalities; the very least of them wears its own special coloration and bears within itself a special facet of divine intention.” Some of the boundaries established after the dismantling of the Eastern bloc were, according to Solzhenitsyn, arbitrary, and effectively made Russians aliens in the near abroad.

In a 1994 interview, for example, Solzhenitsyn noted that Crimea was gifted to Ukraine in 1954 by Nikita Khrushchev “with the arbitrary capriciousness of a satrap.” While such views may suggest support for Putin’s recent actions in Ukraine, Mahoney insists that Solzhenitsyn, were he alive today, “would be more critical of Putin, especially of his refusal to give up power.” There is much to appreciate about Mahoney’s book, not the least of which is the fuller and more nuanced portrayal of the important Russian writer and thinker put forth in its pages. Readers of this illuminating and engaging book will be compelled to read more Solzhenitsyn and will look forward to the day when Solzhenitsyn’s more recent writings are finally translated into English.

James L. Nolan Jr. is professor of sociology at Williams College and the author, most recently, of Legal Accents, Legal Borrowing: The International Problem-Solving Court Movement.

Return of the Repressed
Jay Tolson

The Paradox of Liberation: Secular Revolutions and Religious Counterrevolutions
Michael Walzer
New Haven and London: Yale University Press, 2015.

Michael Walzer has never thought small. The origins of radical politics, just war doctrine, equality, and toleration are among the topics on which the political scientist, a professor emeritus at the Institute for Advanced Study in Princeton, has shed invaluable light, usually by subjecting grand theoretical abstractions to the particularities of specific cultures, nations, and traditions. The paradox explored in this short book, which grew out of the Henry L. Stimson lectures at Yale University, can be summed up in a single question: Why did so many states that gained independence in the post–World War II era and were founded on secular and democratic ideals soon face the powerful challenges of religious revivalism?

Walzer’s inquiry into the inability of “the leaders and militants of secular liberation… to consolidate their achievements and reproduce themselves” focuses on three cases: Israel, where the secularist ideology of Labor Zionism now meets with powerful opposition from champions of a more messianic strain of Zionism as well as ultra-Orthodox Judaism; Algeria, where the secularist (and, briefly, democratic) ideals of the National Liberation Front have been repeatedly challenged and were nearly overturned by militant Islamists; and India, where the ambitious reform program of Jawaharlal Nehru’s Congress party has come up against the fervor and electoral successes of Hindu nationalists determined to assert their primacy within the constitutional order. To most leaders of assorted national liberation movements of the early postwar period, attaining independence and building modern states entailed the creation of a secular public sphere that, while not entirely dispensing with religious influences in realms such as law and education, moved them to the periphery. Nation-building elites, largely Western-educated (or at least educated in Western ideas), viewed religion as a waning force, to be supplanted by what Nehru called “the scientific outlook.” To these leaders, national unity and equality required subduing the divisive effects of religious and ethnic identity. To that end, Israel’s Declaration of Independence announced the creation of both a Jewish and a secular state, guaranteeing the right of religious and national minorities. Similarly, Walzer notes, “[India’s] Congress militants rejected Mohammed Ali Jinnah’s claim that the Muslims constituted a nation of their own just as they rejected the claims of Hindu nationalists.” It must be recalled, too, that the very idea of a nation-state ran counter to many religion-grounded suspicions. Early Zionist thinkers, including Theodor Herzl, had to contend with a deeply rooted wariness about statehood derived from the long exilic experience of the Jewish people.

Jawaharlal Nehru and Mahatma Gandhi, 1946.

If Westernized elites met with early success in moving their new states toward membership in the society of secular states, “which in its origins,” says Walzer, “was a European society but in its ambitions a global society,” they soon encountered resistance from militants who yearned for “a state shaped by their own interpretation or reinterpretation of a particular religious tradition.” In some ways, the growing influence of such militants was the doing of the secular elites, either intentional (by creating democracies in which alternative views could be voiced and advanced) or unintentional (by trying too hard to push religion out of the public sphere). It was no small irony that religious militants would attack the universalist regime of democracy and human rights as a Western import even while using it to advance their particularist agendas. Walzer makes no secret of his affinity with the secular orientation of the founding generation of these young states, but he gives ample voice to their critics, many of whom faulted the liberationist project for preserving the structures of the imperial state or lacking “concrete cultural content.” The latter created a vacuum into which religious traditionalists and modernists, whether Hindu, Muslim, or Jewish, would inevitably rush.

Walzer himself partly blames the ideological zeal of the secularist leaders for the vehemence of the various religion-inspired reactions: “It is the absolutism of secular negation that best accounts for the strength and militancy of the religious revival.” Ancient religions and traditions that are denied or suppressed return too often as modernist re-interpretations that, ironically, may be far less amenable to accommodation with pluralism, equality (including gender equality), and democracy itself. As a chastened liberal, Walzer hopes that political negotiation and engagement will lead to workable compromises between truly tolerant secular universalists and the best kind of religious particularists. Walzer’s brief excursus into one of the thornier problems of the globalizing world cannot respond to the many contradictions, differences, and exceptions that arise in the friction among various religious and secularist projects. Even the words religion and secularism take on strikingly different hues in different cultural and national contexts. How does one think of such large cases as China, for example? Did its Marxist variant on the national liberation project more successfully eradicate the “opiate of the masses,” with possibly deeply unsettling consequences for a society in which no one now truly believes the official (and secular) state ideology? And how does Russia’s current leader encourage and exploit an Orthodox revival to buttress Russian nationalism and his own autocratic form of “managed” democracy? However limited its scope, Walzer’s book—including its brief discussion of America’s difference—makes a convincing case that a purely secular state is an impossibility, and its hoped-for realization one of the greater mistakes of the progressive imagination.

Jay Tolson is editor of The Hedgehog Review.

THE

POINT

“...Intellectually serious, independent, far-reaching, spirited and elegant—a stirring act of resistance against the shrinkage of intellectual life in our culture of takeaways and metrics. This is what a journal of ideas should look like.” —Leon Wieseltier

Question received ideas. Subscribe to The Point. www.thepointmag.com

IN ISSUE 10 (SUMMER 2015) Ferrante in America • De Botton’s School of Life The Failure Festival • What Happened to Queer Politics? Black Lives Matter & the Legacy of the Civil Rights Movement Believing in T. S. Eliot • What’s Good about Melancholy The Art of Decay • Against Honeymoons PLUS: “What is travel for?”

123

Critical Theory of the Contemporary

Timely. Provocative. Independent. Telos is a must-read for anyone with a serious interest in politics, philosophy, culture, and the arts. Subscribe now at www.telospress.com.

Since 1968, the quarterly journal Telos has served as the definitive international forum for discussions of political, social, and cultural change. Readers from around the globe turn to Telos to engage with the sharpest minds in politics and philosophy, and to discover emerging theoretical analyses of the pivotal issues of the day.

Telos Press Publishing PO Box 811 Candor, NY 13743 Tel: 212 · 228 · 6479 www.telospress.com


Subscription Rates
Individuals: $80/year, plus shipping
Institutions: please see our website for complete ordering information.
ISSN 0090-6514 (p) · ISSN 1940-459X (o)

Signifiers

Narrative
Wilfred M. McClay

Academia has a lot to answer for when it comes to the corruption and decay of our language. We all know about the impenetrable prose that has become academia’s stock in trade. But there are certain untoward developments that seem to be attributable to the rise of mass higher education, which has broken down the barrier between academia and public discourse, to the detriment of both. It is no coincidence that the years since World War II, which saw an astonishing rise in college enrollment, have also seen a great many academic words and concepts finding their way into everyday speech. This is a process that has continued unabated, and it has nearly always tended to undermine the vigor and directness of our speech. So now, instead of changing our minds, we undergo a “paradigm shift.” Instead of finding something risky, we find it “problematic.” Instead of a fanciful story being called a fable or a tall tale, it is dubbed “an urban legend.” Instead of identifying one’s intimate partner as something more or less determinate, he or she is one’s “significant other.” Instead of being self-centered, the insufferable young man is “narcissistic.” And one could go on.

Some of these terms are older and more established than others, some are more pretentious than others, but they have in common their academic origins, and the fact that their everyday usage misrepresents their original meaning. Compare, for example, today’s use of “significant other” to its use by the psychologist Harry Stack Sullivan, the man who originated the term, who meant it to refer to the person who directs the primary socialization of a young child. Such technical jargon has a real value when it is confined to the discourse of specialized academic communities. But the flow of such words into our public speech is another matter.

A special case in point is the word narrative. Although a word with a long history, and deep roots in Latin, it has by now become an academic term that has migrated into common speech, bringing hidden freight along with it. Elite journalists, who are likely to be products of university life, are perhaps the most likely to employ it, as a way of signaling their intellectual sophistication. But conservative populists like Rush Limbaugh and Sean Hannity are just as likely to use it, too, particularly in criticizing “the narrative” being put out by establishment politicians or elite media.

Why is that so? What does this development mean? I think the answer is clear. The ever more common use of “narrative” signifies a widespread and growing skepticism about the general accounts of events that are being provided to us. We are living in an era of pervasive genteel disbelief—nothing so robust as relativism, but instead something more like a sustained cognitive shrug—and the word narrative provides us with a way of talking neutrally about such accounts while distancing ourselves from a consideration of their truth. Narratives are understood to be “constructed,” and it is assumed that their constructedness means that they cannot possibly be true, or false, since all such construction involves conscious or unconscious elements of selectivity, acts of suppression, inflation, and substitution, sleights of hand meant to fashion a rhetorical instrument that conveys what the narrator wants us to see and to believe.

That this is a shallow and simplistic view of narrative ought to be obvious. But such an understanding opens up the possibility that anything that has been constructed can be reconstructed, and that we have it in our power to throw off the old narrative and envelop ourselves in a new and better one, with only the merest bow toward telling the whole truth. Asked by Vanity Fair magazine why she chose to resurface in public after years of silence, Monica Lewinsky explained, “I’ve decided, finally, to stick my head above the parapet so that I can take back my narrative and give a purpose to my past.” Now, I have considerable sympathy for Lewinsky, who did not deserve to become a worldwide object of sniggering derision. But one is entitled to wonder, based on her subsequent comments, and on the contents of her moving TED speech, “The Price of Shame,” whether one of the chief goals of “taking back her narrative” will be to downplay, and even erase, her own responsibility for her fate. I hope it doesn’t turn out that way.

If all narratives are equally concocted, they cannot be judged by any standard of truth, only by the extent of their effectiveness as therapy or public relations. Hence, the jaded commentary of our sophisticated pundits, who ask of politicians’ “spin” only whether it will “play” with the booboisie, not whether it is a tissue of lies. Hence, Lewinsky’s fond hope of reinvention: Ye shall believe the narrative, and the narrative shall make you free. But this very belief is self-undermining when we know that it is a “narrative” that we are believing, or pretending to believe. Trust the tale, said D.H. Lawrence, and not the teller. His words would seem to be especially apropos when the teller is oneself. When we turn narrative into a conscious instrument of our will, we will surely find it hard to show much reverence for the thing we have preemptively devalued.

What is especially tragic about our era’s emptying-out of the word narrative in public usage is that it signifies not only systemic distrust but also the loss of narrative as a legitimate form of knowledge. This is an act of profound self-impoverishment. There are stories that are true, in deep, fundamental, and enduring ways; and there are truths about human existence, and about the natural world, and human events, that can only be properly conveyed by means of stories—that is, by a narrative account of a sequence of events whose larger meaning is inseparable from our sequential apprehension of them. The stories that we share widely, particularly the ones we have shared over the centuries, are a very large part of who and what we are. We live in and through such stories, which allow our memories to reach back and our anticipations to leap forward, like searchlights that let us gaze into the darkness behind and peer into the darkness ahead. We have precious few other tools available to us for performing that act of discernment. We would be wise, then, to recover the power of narrative and set it free, rather than continue to keep it confined to the pinched indignity of scare quotes.

Wilfred M. McClay is the G.T. and Libby Blankenship Chair in the History of Liberty at the University of Oklahoma.

New Releases from the Faculty and Fellows of the Institute for Advanced Studies in Culture

Confronting Political Islam: Six Lessons from the West’s Past

John M. Owen IV, Princeton University Press, November 2014

Elites: A General Model

Murray Milner Jr., Polity Press, January 2015

iasc-culture.org

What is Thriving Cities? An initiative of the Institute for Advanced Studies in Culture at the University of Virginia, Thriving Cities offers important insights for scholars, practitioners, and citizens in evaluating the wellbeing of their communities. Thriving Cities is committed to turning those insights into action-oriented tools that will empower key stakeholders—including foundations, city officials, city planners, religious leaders, politicians, educators, business people, academics, non-profits, and residents—to ask and answer the question: What does it mean and take to thrive in my city and how can I contribute?

Who is Thriving Cities? We are a group of unconventional urbanists, coming from many backgrounds and places, who believe that thriving will not be found through the usual strategies and tactics involving technology, money, and policy alone, but rather by situating these critical mechanisms in the context of history, culture, geography, and power. In short, we aim to fill a gap in urban thinking and practice summed up by the question: “What do the humanities have to say to the urban planner?” Out of this perspective, we are creating a conceptual paradigm for urban assessment and a toolkit for putting that paradigm into action. We believe working for the thriving of our communities is not only an empirical science, but also a moral, civic, and political art.

Where to learn more info: [email protected] www.thrivingcities.com

iasc-culture.org


Go digital with The Hedgehog Review. Now only $10 per year!


WWW.HEDGEHOGREVIEW.COM

INSIDE

THE BODY IN QUESTION
Christine Rosen, David Bosworth, Mark Edmundson, Rebecca Lemov, and Gordon Marino

ESSAYS
Alan Jacobs on the witness of literature
James McWilliams on the value of not knowing everything
Ronald Osborn on the great subversion
Johann Neem on the Common Core and democratic education

REVIEWS
Karl Shuve on The Ransom of the Soul
Steven Knepper on Literary Criticism from Plato to Postmodernism
Chad Wellmon on Higher Education in America
James L. Nolan Jr. on The Other Solzhenitsyn
Jay Tolson on The Paradox of Liberation

Published by the Institute for Advanced Studies in Culture
WWW.IASC-CULTURE.ORG