Nothing could be more midwestern than the fact that some of the greatest describers of the Midwest are identified much more closely with other regions. Marilynne Robinson comes to us by way of Idaho, France and Amherst; Saul Bellow we stole from the Canadians. William Maxwell was born in Illinois, but once you’ve edited fiction for the New Yorker, you’re, well, a New Yorker.
To that list, add the Southerner John Jeremiah Sullivan:
Given the relevant maps and a pointer, I know I could convince even the most exacting minds that when the vast and blood-soaked jigsaw puzzle that is this country’s regional scheme coalesced into more or less its present configuration after the Civil War, somebody dropped a piece, which left a void, and they called the void “Central Indiana.” I’m not trying to say there’s no there there. I’m trying to say there’s no there.
There’s another dropped piece further north, in Michigan, and it extends from Muskegon to Traverse City. (Beyond that, you’re getting close to the UP, which, congenial to you or not, is very definitely Somewhere.) This passage, later in the same essay, brought back memories of my last, sad post-college visits to Alma:
When I was about seventeen, I drove back to Indiana … To a man, our old chums divided along class lines. Those of us who’d grown up in Silver Hills, where kids were raised to finish high school and go to college, were finishing high school and applying to colleges. Those who hadn’t, weren’t. They weren’t doing anything.
He describes his friend Ricky, a “white-trash genius” as a child, whom he finds
sitting alone in a darkened room watching a porn movie of a woman doing herself with a peeled banana. He said, “What the fuck is that thing on your head?” I was in a bandanna-wearing phase. This one was yellow. He said, “When I saw you get out of the car, I thought, ‘Who the fuck is that?’ I ’bout shot you for a faggot.” We asked him what was going on. He said he’d just been expelled from school, for trying to destroy one of the boys’ restrooms by flushing lit waterproof M-80s down the toilets. … This entire conversation unfolded as the woman with the banana worked away. Ricky’s dad was asleep in the next room. Retired now. We told him we were headed over to Brad’s next. He said, “I haven’t seen Brad in a while. Did you hear he dorked a spook?” That’s what he said: “Dorked a spook.”
Several ways this paragraph feels painfully true to me:
1) The macho truculence, so well-worn and immediate that it seems to function as its own code of manners;
2) The way illicit teenage thrill-seeking (the woman with the banana) turns into a routine as dull as any eight-hour job;
3) The suggestion of an inescapable, sprawling, mutually hostile family (two paragraphs ago, Sullivan was describing the same Ricky running in fear of this same father who now sleeps in the next room);
4) The fact that this kid was, and is, intelligent as well as mean-spirited: it’s a waterproof M-80;
5) The racism and homophobia, which are absolutely not incompatible with a kind of alienated jittery adolescent-boy brilliance. (Heck, the history of literature proves that: look at Celine.) Re: #5, in my town, and my age cohort, it wasn’t that weird for a kid, circa eighth grade, to discover both the Doors and David Duke at around the same time, ca. 1991.
All of that could have been me, but for two things: I’m a fairly timid person by nature, and my parents read. As a result, I mostly didn’t get along with the kids whose social class matched mine (even an eighth-grader knows what money is): I had no bloodlust and beating me up didn’t really win you that many points. (The main exception to this rule became my friend because, out of pity, he stuck up for me one day in the lunchroom, and got hit in the face for it. He remains, twenty-two years later, one of my favorite people on earth.) I was too socially awkward for most of the kids who, like me, were ultimately headed toward college. (My guidance counselors knew it long before I did. And even at that, without some help from my aunt and uncle, it likely would not have happened. Today, with tuition being what it is, it definitely wouldn’t: I’d have had to settle in at Wal-Mart two days after graduation, and maybe, after saving some money, grabbed a teaching degree at Central Michigan. I’d probably still be there.)
The topic of Sullivan’s essay, by the way, is one Bill Bailey, known to the world as Axl Rose. Among the people I’m thinking of, the clever, hostile children of the post-postwar economic boom, the time when the Rust Belt began to flake apart, no singer was ever more adored. (I thought they were Satanists.) In the essay, Sullivan returns again and again to the subject of Axl’s looks during the sad post-Chinese Democracy period: “To me he looks like he’s wearing an Axl Rose mask … he reminds one of the monster in Predator, or of that monster’s wife on its home planet.” He cycles through a few more funny metaphors, scoring some solid points on that stupid hockey jersey and those ridiculous white-boy dreads, but what struck me was: Axl Rose, these days, looks ghostly, like a man being reclaimed by Nowhere.
One of the first things I remember reading in n+1 was Joshua Glenn’s “The Argonaut Folly,” which talked about the tendency among proprietors of culturally and intellectually serious, marginal literary journals to fantasize about living in a commune with their favorite contributors. (For a more recent example, see some of Jessa Crispin’s recent Bookslut postings.) “For such dreamers,” wrote this former editor of Hermenaut, “merely collaborating with peers who possess skills as unique and impressive as they know their own to be isn’t good enough. Like the Argonauts, they want nothing less than to live and strive together, each and every day.”
This essay spoke to a deep and persistent emotion in me. In college, I dreamed of starting a magazine with, and living in a house with, my favorite fellow staff members of the Calvin College Chimes. Together we would rouse contemporary Christianity from its dogmatic slumbers, we would champion the outre and the marginalized, we would read and drink Yellowtail, that Target Superstore of wines, cheap without being depressing. The funny thing is that this wasn’t only my fantasy: both the magazine and the house we would live in (somewhere among the crumbling mock-Victorians near downtown Grand Rapids) were spoken of in reverential tones by a features editor I used to know. Nathan Vanderklippe, the editor-in-chief that year, would shake his head and say, “Well, I don’t know about this house stuff, but the magazine’s a great idea.” Neither ever happened.
There’s a deep strain of this sort of communal utopianism in American intellectual and literary history: think of all those tiny radical Protestant colleges that Marilynne Robinson writes about, all those mini-communist republics erected on farms in the nineteenth century. (Glenn himself makes this connection, via Brook Farm, that Fourierist folly that inspired Hawthorne’s novel The Blithedale Romance.) Hadn’t my hero Lester Bangs done essentially what I dreamed of doing, not three hours’ drive away, in Walled Lake, Michigan, where the editors and writers of Creem Magazine set the terms for the next quarter-century’s critical discussions of popular music while sharing quarters and drugs? (Yes, he had, and it was a nightmare of poor hygiene and bad labor practices, with repercussions that continue to this day. But we were young, and we would be better.)
I think this fantasy must account for some of the devotion with which many low-circulation literary journals are both produced and remembered. For those of us who dream of working with our dearest friends to raise life to the point where it’s as awesome as literature is, of making our lives a demonstration that the values we write about (and through) (and toward) are actually livable, with a little help and commitment, the print runs of n+1 and Creem, or (for my undergraduate, pissed-off-fringe-evangelical self) of the old Yaconelli-era Wittenberg Door, aren’t just important for the quality of the writing: they represent all the Utopianism you’ve ever allowed yourself.
Few journals seem to be more fondly remembered than Dwight Macdonald’s politics., which takes these sorts of hopes to their logical conclusion: it was the product of the Macdonalds’ marriage. (That’s another sort of community I have tended to invest my utopian desires in.) He and his wife Nancy (I worry that she got stuck with all the boring stuff) put it together in their house, like Leonard and Virginia Woolf running the Hogarth Press, and if they didn’t assemble a commune, they are credited with the creation of the “politics. milieu,” or the “politics. circle,” as the title of one critical study puts it. That’s what I want: a circle! They’re inclusive, safe, and, in magic, always protective.
The old politics. essays and other stray work by Macdonald assembled under the title Memoirs of a Revolutionist show exactly why a person would want to be part of the politics. circle, and why such a circle could hardly stay intact. It begins, after all, with “Politics Past,” basically Macdonald’s political autobiography, which is a tale of broken circles: the Trotskyists who broke off from American communism more generally and then broke with each other, till poor Macdonald had no recourse but to quit Partisan Review (the magazine for mainstream Trotskyists) and form politics (for disaffected anarchist former Trotskyists like himself and, well, himself). In one passage, he writes of a group in which “The ‘ites’ dropped off one by one until the Revolutionary League of America or whatever it was called—the title generally made up in scope for any restriction of numbers—consisted of the leader and his wife.” (I am reminded of my father’s observation about Pentecostal churches, that the longer and more grandiose the name is, the smaller the body of the faithful hiding under it.) “Politics Past” is an invaluable guide to the various schisms in the American prewar left, because it is the only piece I have ever read on this subject that seemed to have been written by a human being and not a Marxist spambot.
Macdonald’s humanity—his humor, his irony, his chastened expectations of himself and others—is easily the most appealing thing about him. It underlies his rejection of the idea of collective guilt in “The Responsibility of Peoples,” a great, penetrating essay in which he excoriates the left-liberal rhetoric that made all Germans guilty for the crimes of Nazism while leaving Allied hands free of Dresden blood. (A bystander’s failure to martyr herself, he very sensibly observes, is a different thing from actual participation in war crimes.) This essay is worth pondering by any of us who have ever wondered whether the constant refrain of the American left—that we are responsible for the actions of our government—merely reinforces the kind of low-level, nagging self-hatred that prevents a person from taking any effective action at all.
At the same time, Macdonald is hardly the person to captain a stable team. To write like he does, you need both a stubborn sense of intellectual honesty and the slight vanity that makes a person strive for originality of expression in the first place. Perhaps, to paraphrase Diogenes, no group of good writers could live with themselves for companions. It would be too inconvenient.
So, who wants to talk about Victor Hugo’s prose style a minute? C’mon, it’ll be fun.
I have never read Hugo before. I started Les Miserables once during a holiday and really liked the first thirty-three pages. Then, as so often happens with books I start on holidays, I got distracted by the possibility of a B-SF movie marathon with my dad. I have never even been tempted to try The Hunchback of Notre Dame, though now I’m reconsidering my skepticism. And, since I’m an English-language reader, that is almost all I knew of Hugo, though I also picked up somewhere along the way that he’s considered one of the great innovators of nineteenth-century fiction, that in France his poetry is also canonical, that he was once a big enough deal that, when someone asked Charles Baudelaire who the greatest living French writer was, he sighed and said “Victor Hugo, helas” (and then probably went back to fumbling his rosary with fingers dipped in the blood of a syphilitic prostitute or something metal like that). Here’s a picture of Hugo’s funeral:
Yup: all those people turned out for a writer. And this leads to the other thing that a literature junkie knows about Hugo without having read him: that his fame, his political and cultural influence, were so outsize that his very funeral has become, for subsequent generations of critics, a stand-in for exactly the kind of social power a person is now assumed permanently to forgo on choosing to become a writer.
So I knew a few things about him without knowing his work. I chose 93 (short for 1793) because the period interested me, I’d never heard of the book before, and I am on a kick right now of reading books that have numbers in the title. (True fact. True, sad, OCD fact.) I also want to practice my terrible French, so I’ve been listening to the free French audiobook while glancing at both the English and French etexts, pausing the soundfile, of course, every other sentence.
In other words, I was expecting difficulty, but I’m still thrown by the oddness of the way Hugo writes.
In the final days of May, 1793, a Paris battalion—one of those dispatched to Bretagne by General Santerre—found itself in Astille, inspecting the infamous woods of Saudraie. All told, they were no more than three hundred men—for such was the time, that moment just after L’Argonne, Jemmapes, Valmy, when even the First Parisian Unit, six hundred men strong, found themselves down to twenty-seven, the Second Unit to thirty-three, the Third Unit to fifty-seven. A time of epic battles.
This battalion, when they had come from Paris into the Vendee, had comprised nine hundred twelve men. They came with three cannons and moved rapidly, on foot. On April 25, the council, with Gohier for its minister of justice and Bouchotte for its minister of war, had proposed sending volunteers to the Vendee; Lubin of the Commune filed the report; on May 1, Santerre had already mustered two thousand, plus thirty pieces of gear and a line of cannonmen. These battalions, made so quickly, were nonetheless made so well that they served the French of ninety years hence as models; it was after their manner that future companies of the line were composed, their proportion, between the number of soldiers and the number of NCOs, that survived.
The ordering of information seems strange to me. Hugo seems to be telescoping back and forth between the battalion (the eventual subject of this opening chapter) and the larger political context (“on May 1, Santerre had already mustered…”). But he’s doing it in a way that seems awkward. The second paragraph starts us off right there with the unit (who are back to 900, though we’ve just learned that at the moment when the scene takes place, they’ve been reduced by two-thirds), and then abruptly backs up, like one of those people who just can’t give directions: “And then you’ll turn right at the light—no, hold on a second, that light, they voted to take it out at the last City Council meeting, maybe it’s already down, you know how they forget to do things sometimes, but then I guess it all depends on whether they could requisition the repair man, you know, to get out there and actually take it down…” By the end of the paragraph he’s looking forward a century: Oh, and by the way, these were just the sorts of battalions that later became the model for all French battalions. It’s all interesting information, but I’m weirded out by the order in which he gives it to us.
A few paragraphs hence:
Now the battalion engaged in the Saudraie woods held their guard. They didn’t hurry. They looked left and right, behind and in front, at once (Kleber once said: “Soldiers have eyes in the back”). How long had they marched? What time was it? How much day was left? You couldn’t say, it was always a sort of night in those wild thickets, it was never clear in the woods.
Except for the surprise quotation from somebody named Kleber, presumably the Revolutionary general Jean-Baptiste Kleber, who at least is from the same period, this is good, tense, psychologically believable prose. And then immediately:
Those woods—they were tragic. Here in these depths, in November 1792, the civil war slit its first throats; Mousqueton, that crazy cripple, had walked out of these shadows; the sheer number of murders that had taken place here would stand your neckhairs erect. No place more appalling. Carefully, the soldiers dug in. The forest was full of flowers; on every side of you, a scrim, almost still, of branches, giving off a smell of fresh leaves; the sun’s rays—there! And there!—pierce green shadows; the gladiola, that marsh-flame, and narcissus nearby, and the daisy that foreshadowed spring, and the spring’s crocus, a border and embroidery on a tapestry of green, and growing on it all every face of moss from the one that looks like velvet all the way to the ones that twinkle like stars: this the soldiers took inch by inch, silent, just touching the brushwood aside. Above their bayonets sang oblivious birds.
Again: I almost love it, with the birds singing far above, but the sudden backup into history-book prose is once again a little jarring. I can believe that the people who a paragraph ago were thinking “It was always night in those thickets” would be thinking about the woods’ recent, murderous history (as a person walking near a certain neighborhood in Milwaukee involuntarily thinks of Jeffrey Dahmer), but not with that pedantic specificity: “In November 1792.”
I think it has to do with Hugo’s … scratch that, I’m really not sure what it has to do with. I don’t know. I’m interested enough to continue with the book, and I’m sure there’s a reason for this odd mixture between pedantry and close observation; I’m just not sure what it is.
I got the idea from somewhere that it was safe to skip Mary McCarthy. The next time I run into Somewhere, I’m going to bop him on the nose. Not only is McCarthy the author of that immortal quip, “Every word she writes is a lie, including and and the”—deployed against the deserving Lillian Hellman—not only was she Edmund Wilson’s wife, not only was she (speaking of noses) the sister of the actor who warned audiences in two separate iterations of Invasion of the Body Snatchers that they were next, and who, in UHF, likened his town to “a festering bowl of dog snot!!” (to speak again of noses); she is also the most pleasurable critic-essayist I’ve read in months. In fact, the last time I had this much fun reading old dispatches from Serious New York-Based Journals of Opinion, it was March, and I was reading Renata Adler.
It’s no accident that this was the year many readers rediscovered both Adler and McCarthy. On one hand, there is the academic and journalistic enthusiasm for rediscovering women writers—an enthusiasm that these particular writers justify in every possible way. (And it barely seems possible that sexism played no role, or even only a small one, in McCarthy’s partial fall in reputation: I didn’t fail to notice, a minute ago, that most of what I knew about her before this week had to do with men, her husband and brother.) On the other hand, there was the reprinting of Adler’s Speedboat and Pitch Dark and the fiftieth anniversary of the publication of McCarthy’s bestselling novel The Group, a birthday observed via a series of thinkpieces that somehow always wanted to link The Group to the work of Lena Dunham. (Which is fine. Unlike most of the people I follow on Twitter, I like Dunham’s work.) But the other factor that makes a McCarthy renaissance inevitable is one that, wearing the costume of another era’s references, turns up in one of On the Contrary’s best essays: the debate among readers and writers about the future of written fiction.
What’s been happening the last few years is that we’re in the middle of a great age for nonfiction writing that has somehow misunderstood itself as the novel’s death rattle. I’ve always found the Reality Hunger line of argument ludicrous: why should I give up roomy social novels just because David Shields has a crush on his iPhone? It’s just more flabby futurism, accompanied by one of the futurist’s most irksome locutions: Shields often labels the kinds of long, detail-rich, non-fragmentary, made-up novels he dislikes “dinosaurs,” which is both to express the hope that a meteor will wipe out these writers’ ecosystem and to assert that one already has. Since the former is merely hateful, discredited, 19th-century-style progressive evolution, and the latter not evidently true, I have trouble escaping the feeling that Shields is just being a savvy self-advertiser. I have the same suspicion about essay booster John D’Agata, whose idea for rejuvenating nonfiction is to have it be, well, fiction.
Still, when everybody’s talking apocalyptically enough, my mood does not remain wholly unaffected. So when I began reading McCarthy’s “The Fact in Fiction” (1960), and encountered the following complaint, I felt as if I were recognizing a contemporary:
Now I am not going to talk about the problems of the novel in this sense at all but rather to confront the fact that the writing of a novel has become problematic today. Is it still possible to write novels—in longhand or on the typewriter, standing or sitting, on Sundays or weekdays, with or without an outline? The answer, it seems to me, is certainly not yes and perhaps, tentatively, no…almost no writer in the West of any consequence, let us say since the death of Thomas Mann, has been able to write a true novel; the exception is Faulkner, who is now an old man.
From here, the argument is Philip Roth’s “American reality outpaces fiction” idea all over again. Great novelists were, by and large, enthusiastic handlers of facts, deliverers of news (novel bits of information): we might say the real novelist is marked by a ravenous reality hunger. In a nuclear-armed world, the little particularities that make up a life seem both ridiculous and a little immoral, like an expensive and time-consuming sexual fetish: “the existence of Highbury or the Province of O– is rendered improbable, unveracious, by Buchenwald and Auschwitz, the population curve of China, and the hydrogen bomb.”
Even those of us who try to avoid newspaper cant have felt this, only today it’s the Internet that makes all our particularities feel unreal, mere bits: why assign significance to the texture of this fact against your little finger when nearly every electronic device in your house is linked, in effect, to the Infinite? When you have everything, what’s the status of anything? Moreover, when everything is, not only present, but linked (in all the senses the last twenty years have given that word) to everything else, it messes with one’s sense of causality, which is the thing that the hefty 19th-century novels tended to study. So we have a lot of great writing, much of it nonfictional (whatever that means), that deals with the consequences of being bits of information inundated by more bits of information; at the same time, we have a sudden jones to be reading pop-science and pop-history all the time, so we won’t miss the next Giant Cause of things that is hiding behind the mere profusion of possible Giant Causes. It’s a good time to be Geoff Dyer, but that doesn’t have to mean it’s a bad time to be Jennifer Egan.
(Or another way to put this: As much as we all love Twitter’s Teju Cole, he’s not more compelling to me than Open City author Teju Cole.)
But when highly eloquent people feel the need to be explaining things to themselves all the time, you’re going to have some great nonfiction writing. (Notice that the period of the so-called “rise of the novel” was also, basically, the Age of Johnson—and a period of intense technological change, population growth, etc.) I’m not convinced yet that the current wave of good nonfiction writers means that there’s been a corresponding attenuation of the novel, and I’m damn sure it doesn’t mean the “death of a form” that basically has no fixed form. But if you wanted to rescue that thesis, you might point out that, in retrospect, it does seem clear that most of the best US writers of the late ’40s and ’50s did their best work in the essay: Baldwin, Vidal, Mailer. Reading On the Contrary made it clear to me that Mary McCarthy belongs to this group at least as much as she did to the one that she wrote her most famous novel about.
This is a polemic against MOOC futurism particularly and economic neoliberalism generally, swirled up with a deeply personal essay about my own intellectual history, with a cherry of love for my alma mater on top. The link expires in a month or so, but I’ll see if the nice Canadians will let the piece live here afterward.
Update: Now reposting the thing below, because the Courier changes their front page every month or so and doesn’t archive. And neither, when it comes to small Calvinist Ontario-based newspapers, do many US libraries.
Calvin College, MOOCs, and Bodies in an Actual Room
By Phil Christman
Fundamentalists are often assumed to lack any sort of intellectual life or culture. In my experience, this isn’t true. My father is both one of the smartest men I’ve ever met and a fundie right down to his (unevolved) toenails; and he taught me, by word and example, that research, argument, and thinking were important, because they could help in serving the single goal that fundamentalism proposes to hang over all of life: the making of converts.
By the time I was ready for college, this view of life had begun to cramp me. To be told, again and again, by my fellow churchgoers, that my interest in writing was valuable because (and, it was implied, only because) it might help “win souls” was to feel something I loved, again and again, reduced to an instrument. The Susan Sontag for fundamentalists of my generation—the person who explained modern literature and art to us in ways that would make a more general appreciation possible—was the conservative Presbyterian thinker Francis Schaeffer, apparently a generous and cultivated man in his private life but, in his books, little more than a cataloguer of ways in which Kant or Walt Whitman or the Beatles got God wrong. At eighteen, I didn’t want a list of thoughts to avoid, but a list of thoughts to explore. More than that, I wanted what fundamentalism, with its single-minded focus on salvation as a moment and not a way of life, could never give me: a rich account of how such exploration might relate to the God revealed in Jesus.
I gained a lot of things by going to Calvin College. But the most important was that explanation: We seek to know because we are made in the image of the ultimate Maker and Knower. Knowledge is an end, not a means.
That answer is not the world’s answer. The world tells us that we learn to accrue social capital; to further the Revolution; to increase likelihood of reproductive success; to make more money. In the US, all of these assumptions are often subsumed under a noble-sounding one: the end of education is to democratize society by increasing social mobility.
Now, I think social mobility is a perfectly good reason for a society to invest in higher education. Some critics scorn it as a “materialistic” concern, but those people have likely never spent a summer cleaning grease traps at a suburban Burger King. I am certainly grateful for the social mobility that Calvin gave me, which saved me from a constrained working-class existence in the cultural backwater of central Michigan. (I must add, though, that I wouldn’t have gained so much of it if I hadn’t been primarily—indeed obsessively—interested in my books. Life often works like that: If you shoot for the stars, you may just get as far as the moon.)
When, in April of this year, a California legislator proposed that students at state schools be allowed, in certain cases, to earn college credit by enrolling in Massive Open Online Courses (MOOCs), we heard quite a bit about how such programs will “increase access” to higher education, and, thus, social mobility. (That they could also devastate an entire industry was glossed over. So was the fact that, in some analyses, universal free public higher education turns out to be surprisingly cheap.) Now, MOOCs in themselves are not bad. I’ve taken a few myself. You log in, watch a few video lectures, do some problem sets, take a quiz. They are better than nothing, in the same way that a phone call from my wife, when she’s out of town, is better than talking to a wall. But the idea that MOOCs can replicate the higher education experience makes no more sense than the idea that phone calls alone constitute a marriage.
My Calvin education can hardly be reduced to a series of movie clips and online quizzes. It was late nights (and early mornings) in the college newspaper office. It was reading a particular assignment very carefully because I wanted to impress a female classmate with my arguments. It was listening to Dr. Anker lecture outdoors on a pollen-filled spring afternoon about Henry David Thoreau’s sacred anger against the strictures of polite society, until one guy decided to embody Thoreau’s point with an impromptu dive into the sem pond. It was Dr. Ward weeping, every two years—he couldn’t help it—when he lectured on the ending of Charlotte Bronte’s Villette. It was Dr. Saupe buying me a sandwich because she thought I looked a little thin. It was Dr. Sterk shepherding me through my first-ever academic conference. It was Ward, Saupe, Sterk, and Dr. Fackler steering me toward open jobs that were right for me.
It was Dr. Felch calling me in to her office when she learned that I’d been suffering from anxiety attacks, and talking with me for three hours.
These sorts of experiences just were my college education. And they were, not incidentally, the source of the social mobility I mentioned earlier: those resume-building freelance jobs would not have been made available to me by a professor I knew only in the way that I know my Twitter followers.
MOOCs only do half the job a good college course does, when it comes to education; when it comes to improving a person’s economic prospects, not even one percent. What MOOCs do make possible is the application to higher ed of a paradigm that dominates much of US life: they allow state money to be funneled, however indirectly, to the private investors who own most of the established MOOC providers. Let me be clear: I’m not accusing anyone of venality. US intellectuals and policymakers now believe in these sorts of public-private wealth transfers with the same disinterested fervor that they once believed in beating communism. We believe in markets more than we believe in education.
Which brings me back to Calvin College.
The details of Calvin’s financial scandal aren’t entirely clear yet. But if I were going to bet on anything, it would be this: nobody involved in this mess acted from personal greed. Even the revelation that former Calvin President Gaylen Byker served on the board of an oil company that was, by some subsidiary-within-subsidiary corporate relationship of the kind only lawyers understand, also linked to the hedge funds Calvin used, doesn’t make me think anyone was being greedy. When profit-making is assumed to be a legitimate goal of any and every human activity, even religious schools, why not use your loan-payback savings to score a few extra dividends? Money is useful and more money more so. It’s a perfectly natural decision in a culture deeply convinced that profit-seeking is the fundamental human activity.
Nineteenth-century US Christians brought books and tools to swamps and mudholes and built tiny religious colleges all over the Midwest, then admitted women and African Americans in defiance of every social norm. They didn’t do this because they were rational utility maximizers. They did it because they believed education would protect people against enslavement (which is, I might mention, the ultimate social mobility), and because they believed learning glorifies God. One of the things I learned at Calvin is that educators who genuinely believe this tend to do quite a bit both for students’ minds and their prospects. I worry future children won’t have the option of learning that fact from Calvin. I know they’ll never learn it from a MOOC.
Part of the addictive quality of certain writers is that they point you, not only to whatever is interesting or new in their own work, but to an entire secret library. It’s not just that canonical writers often have fascinating noncanonical friends and mentors; it’s that the canon itself is so vast that you can’t find your way through it without a whole series of little trail maps, each one starting with someone you already love. Small example: I probably would have run across Joanna Russ’s name sooner or later, but Samuel R. Delany’s writing about her gave me both the sense of urgency that actually gets a book read—the feeling of Whoah, that sounds awesome that gets you to read We Who Are About To, not five books from now (by which time you may already have forgotten about it), but now—and something to look for in it.
One of the many—many—glories of C.S. Lewis obscured by his current joint status as Increasingly Disreputable Children’s Author/Only Worthwhile Modern Author Found in Christian Bookstores is that he’ll point you to the strangest and most wonderful things. He’s literally the only reason I know who George MacDonald is. His was the first serious recommendation of John Ruskin that I ever read. And there’s certainly no way in hell that Charles Williams’s weirdly sexual Jesus-magick potboilers would still be in print if Lewis hadn’t provided his publishers with such useful blurb copy.
Any reader of Lewis’s will know the name Owen Barfield—and, until this week, the name was nearly all I knew about him; that, plus Lewis’s famous description of their friendship (from Surprised By Joy):
The First is the alter ego, the man who first reveals to you that you are not alone in the world by turning out (beyond hope) to share all your most secret delights…But the Second Friend is the man who disagrees with you about everything. He is not so much the alter ego as the antiself. Of course he shares your interests; otherwise he would not become your friend at all. But he has approached them all at a different angle. He has read all the right books but has got the wrong thing out of every one. It is as if he spoke your language but mispronounced it. How can he be so nearly right and yet, invariably, just not right?
Arthur Greeves was Lewis’s first friend; Barfield was his second.
Aside from the sexism of the pronouns, I find this schema reasonably useful, although there are close friendships of mine that it doesn’t fit at all. Certain of my friends (Adam, Ali, Ben) like most of the same stuff as me, in mostly the same ways. Others (my dad, or Matt Simmons) also like that stuff, but they like it all wrong, thus putting me under a salutary sort of pressure to explain myself. Both groups make me a stronger thinker and better person (given the constraints imposed by the initial material). And my relationship with my wife partakes of elements of both.
History, Guilt, and Habit was the first book I’d ever read by Barfield, and every so often I ran across notions that I knew well from Lewis. (The philosophically sound, but for some reason academically unfashionable, idea that inseparable things can still be usefully distinguished from each other was one. Another was Barfield’s strong and useful awareness of the present as just one more time period, characterized by its own habits of thought and their limitations.) These ran through the text like snatches of a familiar pop song in an utterly foreign country. Barfield is very much his own man, and he’s fascinating.
The book is basically an attempt to take seriously two ideas that are totally compatible, but rarely seen running together in the wild: evolution and human exceptionalism. On the one hand, Barfield is not afraid of the idea that we share a common ancestor with our simian friends. On the other, he insists that human consciousness is not merely “a bit stuck on” to nature; and the book begins with a meditation on the differences between the concepts “evolution” and “history”:
Evolutionists and historians must think differently. The knowledge they are aiming at is knowledge of the macroscopic world. And that—as I have been emphasizing—is unquestionably both subjective and objective. When a historian, for instance, talks about Caesar crossing the Rubicon … he does not mean by the word “river” simply a combination of oxygen and hydrogen; he means a rich assemblage of qualities like coldness, gurgling, flashing in the sun. When he is describing Napoleon’s retreat from Moscow, the word “cold” does not signify a thermometer reading; it means the felt quality “cold.” …[T]he macroscopic world is compounded of qualities, as well as quantities. It is therefore what we perceive and, accordingly, is inseparable from what we think. I hope my vocabulary is not irritating anyone. It doesn’t matter what you call it…as long as you remember that what we perceive is the actual world, and not a kind of shadow-show pretending to be the actual world (with, of course, an actual world, consisting of particles, mathematical equations or something of the sort, hiding behind it).
So on the one hand Barfield begins by rejecting philosophical materialism—all our knowledge is, one way or another, the product of human perceptions (even a math formula must be read by someone), and cannot be reduced to cold data. (To speak of doing so is already to introduce numberless metaphors. Why is it “cold”? What “gave” it? Etc.) As he repeatedly insists, what we perceive is “structurally inseparable from what we think.” He elaborates on this latter point:
You may remember William James’s supposition of a confrontation between, on the one hand, the environment…and, on the other, a man who possessed all the organs of perception, but who had never done any thinking. He demonstrated that such a man would perceive nothing, or nothing but what James called “a blooming, buzzing confusion.”
Consciousness is all. But not all, because Barfield is sufficiently aware that human life has a history—a history that begins billions of years before consciousness does, a history that can be traced back to overexcited cyanobacteria in a chemical soup—that he profoundly believes that the nature of human consciousness has changed. Basically, we have separated ourselves from nature. When Aristotle and his contemporaries talked of the four elements, Earth, Air, Fire, and Water, this “was much less a perception of something detached from themselves, much more a perception of qualities and processes to be found within themselves, within their own consciousness, as well as without in nature.”
Barfield insists that we can see this by analyzing the history of words and of literature, though not before acknowledging nearly insuperable difficulties in doing so: “You can dig into the earth with a spade…But if, by some mysterious dispensation, the spade were part of the very patch of earth you were splitting up, you would be rather nonplussed.”
But, in any case, it’s Barfield’s thesis that languages were once a good deal more figurative, and that our attempt to make them fully conceptual is not only doomed but, in the meantime, stupefying us, in part by making us less able to “think ourselves into” earlier philosophers. Undoing this flattening of meaning would seem to be the work of imaginative readers and writers. You can see why so many poets have adored Barfield.
There’s much more here, and apparently the three lectures contained in History, Guilt, and Habit are intended as a sort of entree to his philosophy (as found in works like Poetic Diction and Saving the Appearances). I will now be reading both of these in their turn. Barfield is wacky, as befits an Anglican follower of Rudolf Steiner, but his ideas fascinated me here, and reminded me what a fearsome responsibility I take on when I lift, from its stream-bed of long-dead peoples’ memories and associations and (Barfield’s favorite word) perceptions, and place into the sentence, the merest word.
I’ve seen these at dozens of libraries, cooling their fat heels in the seldom-visited A section, accompanied by The Year Book and The Table Book. Like the dusty lawbooks that exist mainly to reassure TV lawyers’ clients that TV lawyers are, indeed, TV lawyers, they seem more like furniture than like matter intended for someone to read. And that very quality in an old-enough tome tends to make me want to crack it open. What was it about this giant slab of text that made certain people once feel that they needed it in a room as surely as a couch needs cushions?
It turns out that The Every-Day Book is a rich project with a fascinating history. It wasn’t made to be read straight through, nor did I come anywhere near doing so; it belongs to that genre of book that ought to live in guestrooms or on the tank of a toilet, where it’s always available as emergency relief for boredom. (Of course, boredom. What did you think I was gonna say?) It’s a Victorian-era miscellany, in the style of Schott’s Miscellany or The World Almanac, arranged on a calendar model—the first entry is for January, with each succeeding entry devoted to a day of the year.
I began at the beginning. January is named after Janus—it is a time for facing both ways, for planning, regretting, remembering and resolving. It’s also cold as hell, which is why the Anglo-Saxons called it “The Wolf Month.” January 1 is the feast of Fulgentius, who “sometimes went barefoot, never undressed to take rest, nor ate flesh meat.” On January 1 “congratulations, presents, and visits were made” by the ancient Romans, while druids used to cut mistletoe with a golden knife and hand it out to their friends. Charles Lamb once wrote of January 1: “Of all sounds of all bells—(bells, the music nighest bordering upon heaven)—most solemn and touching is the peal which rings out the Old Year. I never hear it without a gathering-up of my mind to a concentration of all the images that have been diffused over the past twelvemonth; all I have done or suffered, performed or neglected–in that regretted time.”
This was all super-interesting, and I had visions of myself plowing through both volumes and moving on to The Table Book, etc., enjoying a vast increase in my powers of conversation as I did so. “I hate Tuesdays,” a friend would say, and I’d reply, “But this is Tuesday the 24th of May! On this day, in 685, Wyglf the King of the Jutes became the first man in England to wear shoelaces!” And my friend would walk away with spirits repaired by the knowledge that, however dull his or her daily grind, at least it takes place within History, adjacent at all times to a multiplicity of wonderful facts.
This feeling wore off after about twelve minutes.
Still, the Every-Day Book is a remarkable thing, a Wikipedia compiled by one man, William Hone, and if I run across one at an antique store I will certainly be picking it up. Meanwhile, the name “William Hone” had rung a bell. As I read the “About the Author” note, I flashed back to early 2012, when I read A Budget of Paradoxes, by the great British logician Augustus De Morgan. He had referred several times to a famous English early-nineteenth-century free-speech case. Yup: same guy. I’ll let De Morgan summarize:
The results of Hone’s trials (William Hone, 1779-1842) are among the important constitutional victories of our century. He published parodies on the Creeds, the Lord’s Prayer, the Catechism, etc., with intent to bring the Ministry into contempt: everybody knew that was his purpose. The Government indicted him for impious, profane, blasphemous intent, but not for seditious intent. They hoped to wear him out by proceeding day by day. December 18, 1817, they hid themselves under the Lord’s Prayer, the Creed, and the Commandments; December 19, under the Litany; December 20, under the Athanasian Creed, an odd place for shelter when they could not find it in the previous places. Hone defended himself for six, seven, and eight hours on the several days: and the jury acquitted him in 15, 105, and 20 minutes.
His cases didn’t actually end blasphemy laws in England, but they apparently helped to discredit them in the public eye: everybody knew the prosecutions were really political, because the jokes in Hone’s travesties of the various creeds were aimed, in every case, at misbehaving government ministers, not at Christian doctrine (though Hone was at this point a religious skeptic). He attracted great public support. The Day-Books were written much later, in the late 1820s, and intended as cash cows, to save Hone from debtor’s prison. (They didn’t work.) That institution is an irredeemably evil one no matter who falls into its clutches, but Hone in particular deserved better treatment from the world. This guy helped invent investigative journalism, saved the reputation (though not, alas, the life) of a woman falsely accused of poisoning her boss, fought for universal male suffrage, advocated for the reform of insane asylums, made fun of political corruption, and generally behaved like a nineteenth-century Noam Chomsky. Another late product of his literary life was The Apocryphal New Testament, one of the first examples of a still-lucrative anthology genre. By the time of the Every-Day Book he had returned to the relatively devout sort of Anglicanism in which he’d been raised, and that book is full of quotations from books of saints’ lives. From the sound of things, that’s a genre to which any biography of Hone would have to be assigned—no matter how you define “saint.”
(As for those stupid blasphemy laws themselves: they were still in effect in 1977, when a publication called Gay News got itself hauled before the bench over a dirty poem involving my savior. Jesus has survived far worse things than being the subject of someone’s pornographic fan fiction, but the judge, unbelievably, didn’t see it that way, and Gay News lost both the case and the appeals. The suit was brought by Mary Whitehouse, who was already on my shitlist for her campaign against “Doctor Who,” which she called, famously, “Teatime brutality for tots.” The laws were finally repealed in 2008. We live in a strange world.)